

Chapter 3 Matrices (Additional Questions)

Welcome to this essential supplementary practice section dedicated to Matrices, a powerful mathematical tool introduced comprehensively in your Class 12 studies. Matrices provide a concise and efficient way to represent and manipulate large blocks of information, particularly linear systems, and they form the bedrock of linear algebra with vast applications in computer graphics, physics, economics, and data science. While the core chapter introduces the fundamental definitions, types of matrices, algebraic operations, and the crucial concept of the inverse, this collection of additional questions is designed to push your understanding further through more complex calculations, intricate proofs involving matrix properties, and challenging applications, ensuring you achieve true fluency and mastery.

Recall the foundational concepts: a matrix is a rectangular array of numbers or functions arranged in rows and columns. You learned about various types (row, column, square, diagonal, scalar, identity, zero matrices) and the core operations: addition and subtraction of matrices of the same order (performed element-wise), multiplication of a matrix by a scalar, and matrix multiplication, which is defined only when the number of columns of the first matrix equals the number of rows of the second.

You also explored the Transpose of a matrix ($A^T$ or $A'$), obtained by interchanging rows and columns, and its properties like $(A+B)^T = A^T + B^T$ and $(AB)^T = B^T A^T$. This led to defining Symmetric ($A^T = A$) and Skew-Symmetric ($A^T = -A$) matrices, and the fact that any square matrix can be uniquely expressed as the sum of a symmetric and a skew-symmetric matrix. Finally, the chapter introduced Elementary Row (or Column) Operations and their crucial application in finding the Inverse ($A^{-1}$) of an invertible (non-singular) square matrix, characterized by $AA^{-1} = A^{-1}A = I$ (where $I$ is the identity matrix).

This supplementary section significantly elevates the challenge. Expect problems involving the multiplication of larger order matrices or demanding multi-step calculations combining addition, scalar multiplication, and matrix multiplication, rigorously testing your understanding of the order of operations and properties like associativity and distributivity. A strong emphasis is placed on proof-based questions. You will be required to prove various properties related to the transpose (e.g., $(ABC)^T = C^T B^T A^T$), symmetric/skew-symmetric matrices (e.g., proving $A+A^T$ is symmetric), or consequences of matrix multiplication and invertibility, such as the important result $(AB)^{-1} = B^{-1}A^{-1}$ for invertible matrices $A$ and $B$.

Extensive practice is provided for finding the inverse of $3 \times 3$ matrices using elementary row (or column) operations, a methodical but sometimes lengthy process demanding high accuracy. These might involve matrices containing parameters, adding an algebraic dimension. You will also tackle solving various matrix equations, such as finding an unknown matrix $X$ satisfying $AX = B$ or $XA = B$ (requiring pre- or post-multiplication by $A^{-1}$, if it exists). While solving systems of linear equations using matrix inverses or Gaussian elimination is often a separate topic or chapter, introductory problems here might use matrix multiplication to model simple systems or transformations. You might also explore properties of special matrix types like idempotent ($A^2=A$) or nilpotent ($A^k=0$ for some $k$) matrices. Engaging thoroughly with this rigorous practice is essential for mastering matrix operations, deepening your understanding of fundamental matrix properties through proofs, building proficiency and accuracy in finding inverses using elementary operations, and laying the groundwork for advanced applications in solving systems of linear equations, linear transformations, and further studies in linear algebra.
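
Because so much of this practice hinges on the mechanics of elementary row operations, a short computational sketch may help make the procedure concrete before you work by hand. The following Python/NumPy function (an illustrative assumption on our part; the name `inverse_by_row_ops` is hypothetical, and in practice one would simply call `np.linalg.inv`) applies the three elementary row operations to the augmented matrix $[A \mid I]$ until the left half becomes $I$:

```python
import numpy as np

def inverse_by_row_ops(A):
    """Invert a square matrix via elementary row operations on [A | I]."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])  # augmented matrix [A | I]
    for col in range(n):
        # R_i <-> R_j: bring a row with a non-zero pivot into position
        pivot = col + int(np.argmax(np.abs(M[col:, col])))
        if np.isclose(M[pivot, col], 0.0):
            raise ValueError("matrix is singular; no inverse exists")
        M[[col, pivot]] = M[[pivot, col]]
        # R_i -> (1/c) R_i: scale so the pivot entry becomes 1
        M[col] = M[col] / M[col, col]
        # R_j -> R_j + k R_i: eliminate the other entries in this column
        for row in range(n):
            if row != col:
                M[row] = M[row] - M[row, col] * M[col]
    return M[:, n:]  # once the left half is I, the right half is A^{-1}

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
print(inverse_by_row_ops(A))  # [[ 1. -1.] [-1.  2.]]
print(np.linalg.inv(A))       # cross-check against the library routine
```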



Objective Type Questions

Question 1. If a matrix has 18 elements, which of the following is NOT a possible order for the matrix?

(A) $1 \times 18$

(B) $2 \times 9$

(C) $3 \times 6$

(D) $4 \times 5$

Answer:

The number of elements in a matrix is given by the product of its number of rows and number of columns. If a matrix has order $m \times n$, then it has $mn$ elements.

Given that the matrix has 18 elements, the product of its number of rows ($m$) and columns ($n$) must be equal to 18.

We need to find which of the given options represents an order $(m \times n)$ such that $m \times n$ is not equal to 18.

The possible pairs of positive integer factors of 18 are (1, 18), (2, 9), (3, 6), (6, 3), (9, 2), and (18, 1).

These pairs correspond to the possible orders for a matrix with 18 elements: $1 \times 18$, $2 \times 9$, $3 \times 6$, $6 \times 3$, $9 \times 2$, and $18 \times 1$.

Let's examine the given options:

(A) $1 \times 18$: The product is $1 \times 18 = 18$. This is a possible order.

(B) $2 \times 9$: The product is $2 \times 9 = 18$. This is a possible order.

(C) $3 \times 6$: The product is $3 \times 6 = 18$. This is a possible order.

(D) $4 \times 5$: The product is $4 \times 5 = 20$. Since $20 \neq 18$, this is not a possible order for a matrix with 18 elements.

Therefore, the order $4 \times 5$ is NOT a possible order for a matrix with 18 elements.


The correct option is (D) $4 \times 5$.
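
As a quick cross-check (a Python sketch we add for illustration; it is not part of the original solution), one can enumerate every order $(m, n)$ with $mn = 18$:

```python
# All orders (m, n) whose product of rows and columns is 18
orders = [(m, 18 // m) for m in range(1, 19) if 18 % m == 0]
print(orders)            # [(1, 18), (2, 9), (3, 6), (6, 3), (9, 2), (18, 1)]
print((4, 5) in orders)  # False: a 4 x 5 matrix has 20 elements, not 18
```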

Question 2. Construct a $2 \times 2$ matrix $A = [a_{ij}]$ where $a_{ij} = |i - j|$.

(A) $\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$

(B) $\begin{bmatrix} 0 & -1 \\ -1 & 0 \end{bmatrix}$

(C) $\begin{bmatrix} 0 & 1 \\ 2 & 0 \end{bmatrix}$

(D) $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

Answer:

A $2 \times 2$ matrix $A$ has the form $A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}$.

The elements of the matrix are given by the formula $a_{ij} = |i - j|$, where $i$ represents the row number and $j$ represents the column number.

We need to calculate each element of the $2 \times 2$ matrix:

For the element in the first row, first column ($i=1, j=1$):

$a_{11} = |1 - 1| = |0| = 0$

For the element in the first row, second column ($i=1, j=2$):

$a_{12} = |1 - 2| = |-1| = 1$

For the element in the second row, first column ($i=2, j=1$):

$a_{21} = |2 - 1| = |1| = 1$

For the element in the second row, second column ($i=2, j=2$):

$a_{22} = |2 - 2| = |0| = 0$

Substituting these values into the $2 \times 2$ matrix form, we get:

$A = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$

Comparing this result with the given options, we find that it matches option (A).


The correct option is (A) $\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$.
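
The same construction can be checked mechanically. Here is a minimal sketch, assuming Python with NumPy, that builds $A = [a_{ij}]$ with $a_{ij} = |i - j|$ using 1-based indices:

```python
import numpy as np

# a_ij = |i - j| with row index i and column index j running over 1, 2
A = np.array([[abs(i - j) for j in (1, 2)] for i in (1, 2)])
print(A)
# [[0 1]
#  [1 0]]
```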

Question 3. Which of the following matrices is a row matrix?

(A) $\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$

(B) $\begin{bmatrix} 1 & 2 & 3 \end{bmatrix}$

(C) $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

(D) $\begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$

Answer:

A row matrix is a matrix that has only one row, regardless of the number of columns.

Let's analyze the order (number of rows $\times$ number of columns) of each given matrix:

(A) $\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$: This matrix has 3 rows and 1 column. Its order is $3 \times 1$. This is a column matrix.

(B) $\begin{bmatrix} 1 & 2 & 3 \end{bmatrix}$: This matrix has 1 row and 3 columns. Its order is $1 \times 3$. Since it has only one row, this is a row matrix.

(C) $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$: This matrix has 2 rows and 2 columns. Its order is $2 \times 2$. This is a square matrix (specifically, an identity matrix).

(D) $\begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$: This matrix has 2 rows and 2 columns. Its order is $2 \times 2$. This is a square matrix (specifically, a null or zero matrix).

Based on the definition, the matrix in option (B) is a row matrix as it has only one row.


The correct option is (B) $\begin{bmatrix} 1 & 2 & 3 \end{bmatrix}$.

Question 4. If $A$ and $B$ are two matrices such that $A+B$ is defined, which of the following must be true about their orders?

(A) A and B have the same number of rows.

(B) A and B have the same number of columns.

(C) A and B have the same order (same number of rows and columns).

(D) Number of columns in A equals number of rows in B.

Answer:

For the addition of two matrices $A$ and $B$ to be defined, the matrices must have the same order. This means that both matrices must have the same number of rows and the same number of columns.

Let the order of matrix $A$ be $m \times n$ and the order of matrix $B$ be $p \times q$.

For $A+B$ to be defined, we must have $m = p$ (same number of rows) and $n = q$ (same number of columns).

If the orders are the same, say $m \times n$, then the resulting matrix $A+B$ also has order $m \times n$, and its elements are given by the sum of the corresponding elements of $A$ and $B$, i.e., $(A+B)_{ij} = a_{ij} + b_{ij}$. This element-wise addition requires every element of $A$ to have a counterpart in the same position of $B$, which is exactly why the two matrices must have the same size.

Let's look at the options:

(A) A and B have the same number of rows: This is a necessary condition, but not sufficient on its own for addition.

(B) A and B have the same number of columns: This is also a necessary condition, but not sufficient on its own for addition.

(C) A and B have the same order (same number of rows and columns): This means $m=p$ and $n=q$. This is the complete and correct condition for matrix addition to be defined.

(D) Number of columns in A equals number of rows in B: This condition ($n=p$) is required for matrix multiplication $AB$ to be defined, not for matrix addition $A+B$.

Therefore, for $A+B$ to be defined, matrices A and B must have the same order.


The correct option is (C) A and B have the same order (same number of rows and columns).

Question 5. If $A = \begin{bmatrix} 1 & -1 \\ 2 & 0 \end{bmatrix}$ and $B = \begin{bmatrix} 3 & 1 \\ 0 & 4 \end{bmatrix}$, find $A-2B$.

(A) $\begin{bmatrix} -5 & -3 \\ 2 & -8 \end{bmatrix}$

(B) $\begin{bmatrix} -5 & -3 \\ 2 & 8 \end{bmatrix}$

(C) $\begin{bmatrix} 7 & 1 \\ 2 & 8 \end{bmatrix}$

(D) $\begin{bmatrix} 7 & -3 \\ 2 & -8 \end{bmatrix}$

Answer:

We are given two matrices $A = \begin{bmatrix} 1 & -1 \\ 2 & 0 \end{bmatrix}$ and $B = \begin{bmatrix} 3 & 1 \\ 0 & 4 \end{bmatrix}$. We need to find the matrix $A - 2B$.

First, we calculate the scalar multiplication of matrix $B$ by 2:

$2B = 2 \times \begin{bmatrix} 3 & 1 \\ 0 & 4 \end{bmatrix} = \begin{bmatrix} 2 \times 3 & 2 \times 1 \\ 2 \times 0 & 2 \times 4 \end{bmatrix} = \begin{bmatrix} 6 & 2 \\ 0 & 8 \end{bmatrix}$

Now, we perform the matrix subtraction $A - 2B$. To subtract matrices, we subtract the corresponding elements:

$A - 2B = \begin{bmatrix} 1 & -1 \\ 2 & 0 \end{bmatrix} - \begin{bmatrix} 6 & 2 \\ 0 & 8 \end{bmatrix}$

$A - 2B = \begin{bmatrix} 1 - 6 & -1 - 2 \\ 2 - 0 & 0 - 8 \end{bmatrix}$

$A - 2B = \begin{bmatrix} -5 & -3 \\ 2 & -8 \end{bmatrix}$

Comparing this result with the given options, we see that it matches option (A).


The correct option is (A) $\begin{bmatrix} -5 & -3 \\ 2 & -8 \end{bmatrix}$.

Question 6. If matrix $A$ is of order $m \times n$ and matrix $B$ is of order $n \times p$, what is the order of the matrix product $BA$?

(A) $m \times p$

(B) $p \times n$

(C) $p \times m$

(D) The product $BA$ is not defined.

Answer:

Let the order of matrix $A$ be $m \times n$. This means matrix $A$ has $m$ rows and $n$ columns.

Let the order of matrix $B$ be $n \times p$. This means matrix $B$ has $n$ rows and $p$ columns.

For the matrix product $BA$ to be defined, the number of columns in the first matrix ($B$) must be equal to the number of rows in the second matrix ($A$).

Number of columns in $B$ is $p$.

Number of rows in $A$ is $m$.

For the product $BA$ to be defined, we must have $p = m$.

The problem states that matrix $A$ is of order $m \times n$ and matrix $B$ is of order $n \times p$. It does not specify that $p$ is equal to $m$.

In the general case, where $p$ is not necessarily equal to $m$, the condition for matrix multiplication $BA$ to be defined is not met.

Therefore, the product $BA$ is not defined for arbitrary orders $m \times n$ and $n \times p$ where $p \neq m$.

If, however, the condition $p=m$ were met, then the order of the resulting matrix $BA$ would be (number of rows in $B$) $\times$ (number of columns in $A$), which is $n \times n$. None of the options (A), (B), or (C) represent the order $n \times n$.

Given the options and the general orders specified, the most accurate statement is that the product $BA$ is not defined in the general case.


The correct option is (D) The product $BA$ is not defined.

Question 7. If $A = \begin{bmatrix} 1 & 2 \end{bmatrix}$ and $B = \begin{bmatrix} 3 \\ 4 \end{bmatrix}$, find $AB$.

(A) $\begin{bmatrix} 3 & 4 \\ 6 & 8 \end{bmatrix}$

(B) $\begin{bmatrix} 11 \end{bmatrix}$

(C) $\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$

(D) $\begin{bmatrix} 7 \end{bmatrix}$

Answer:

We are given two matrices $A = \begin{bmatrix} 1 & 2 \end{bmatrix}$ and $B = \begin{bmatrix} 3 \\ 4 \end{bmatrix}$. We need to find the matrix product $AB$.

First, let's determine the order of each matrix:

Matrix $A$ has 1 row and 2 columns, so its order is $1 \times 2$.

Matrix $B$ has 2 rows and 1 column, so its order is $2 \times 1$.

For the product $AB$ to be defined, the number of columns in the first matrix ($A$) must equal the number of rows in the second matrix ($B$). The number of columns in $A$ is 2, and the number of rows in $B$ is 2. Since $2 = 2$, the product $AB$ is defined.

The order of the resulting matrix $AB$ will be (number of rows in $A$) $\times$ (number of columns in $B$), which is $1 \times 1$.

To find the elements of the product matrix $AB$, we multiply the elements of the row(s) of the first matrix by the corresponding elements of the column(s) of the second matrix and sum the products.

In this case, $AB$ is a $1 \times 1$ matrix, so it has only one element, $c_{11}$.

To find $c_{11}$, we take the first row of $A$ ($\begin{bmatrix} 1 & 2 \end{bmatrix}$) and the first column of $B$ ($\begin{bmatrix} 3 \\ 4 \end{bmatrix}$).

$c_{11} = (1 \times 3) + (2 \times 4)$

$c_{11} = 3 + 8$

$c_{11} = 11$

So, the product matrix $AB$ is $\begin{bmatrix} 11 \end{bmatrix}$.

Comparing this result with the given options, we see that it matches option (B).


The correct option is (B) $\begin{bmatrix} 11 \end{bmatrix}$.

Question 8. If $A = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$, then $A^2$ is:

(A) $\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$

(B) $\begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$

(C) $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

(D) $\begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}$

Answer:

We are asked to find $A^2$, which is the product of matrix $A$ with itself, i.e., $A \times A$.

Given $A = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$.

We need to compute $A^2 = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} \times \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$.

The product of two $2 \times 2$ matrices results in a $2 \times 2$ matrix.

Let $A^2 = \begin{bmatrix} c_{11} & c_{12} \\ c_{21} & c_{22} \end{bmatrix}$.

To find the elements of $A^2$, we multiply the rows of the first matrix by the columns of the second matrix:

$c_{11} = (0 \times 0) + (1 \times 0) = 0 + 0 = 0$

$c_{12} = (0 \times 1) + (1 \times 0) = 0 + 0 = 0$

$c_{21} = (0 \times 0) + (0 \times 0) = 0 + 0 = 0$

$c_{22} = (0 \times 1) + (0 \times 0) = 0 + 0 = 0$

So, $A^2 = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$.

This is the null matrix of order $2 \times 2$.

Comparing this result with the given options, we see that it matches option (B).


The correct option is (B) $\begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$.
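
A matrix whose square (or some higher power) is the zero matrix is called nilpotent, and this can be confirmed in one line (a NumPy sketch added for illustration):

```python
import numpy as np

A = np.array([[0, 1],
              [0, 0]])
print(A @ A)   # the @ operator performs matrix multiplication
# [[0 0]
#  [0 0]]  -> A^2 = O, so A is nilpotent
```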

Question 9. The transpose of a column matrix is a ____ matrix.

(A) Row

(B) Column

(C) Square

(D) Diagonal

Answer:

A column matrix is a matrix that consists of a single column. Its order is typically $m \times 1$, where $m$ is the number of rows and 1 is the number of columns.

For example, a column matrix with $m$ elements can be represented as:

$A = \begin{bmatrix} a_{11} \\ a_{21} \\ \vdots \\ a_{m1} \end{bmatrix}$

The transpose of a matrix is obtained by interchanging its rows and columns. If a matrix $A$ has order $m \times n$, its transpose $A^T$ has order $n \times m$.

In the case of a column matrix $A$ of order $m \times 1$, its transpose $A^T$ will have the order $1 \times m$.

Let's find the transpose of the column matrix $A$ shown above. The single column of $A$ becomes the single row of $A^T$:

$A^T = \begin{bmatrix} a_{11} & a_{21} & \dots & a_{m1} \end{bmatrix}$

This resulting matrix $A^T$ has 1 row and $m$ columns. A matrix with only one row is called a row matrix.

Therefore, the transpose of a column matrix is a row matrix.


The correct option is (A) Row.

Question 10. If $A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$ and $B = \begin{bmatrix} 0 & -1 \\ 1 & 5 \end{bmatrix}$, find $(AB)'$.

(A) $\begin{bmatrix} 2 & 7 \\ 4 & 23 \end{bmatrix}$

(B) $\begin{bmatrix} 2 & 4 \\ 7 & 23 \end{bmatrix}$

(C) $\begin{bmatrix} 2 & 4 \\ 23 & 7 \end{bmatrix}$

(D) $\begin{bmatrix} 4 & 2 \\ 23 & 7 \end{bmatrix}$

Answer:

We are given matrices $A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$ and $B = \begin{bmatrix} 0 & -1 \\ 1 & 5 \end{bmatrix}$. We need to find the transpose of the matrix product $AB$, denoted by $(AB)'$ or $(AB)^T$.

First, let's calculate the matrix product $AB$. Matrix $A$ is of order $2 \times 2$ and matrix $B$ is of order $2 \times 2$. Since the number of columns in $A$ (2) is equal to the number of rows in $B$ (2), the product $AB$ is defined and will be a matrix of order $2 \times 2$.

Let $AB = C = \begin{bmatrix} c_{11} & c_{12} \\ c_{21} & c_{22} \end{bmatrix}$. The elements $c_{ij}$ are calculated as the sum of the products of the elements from the $i$-th row of $A$ and the $j$-th column of $B$.

$c_{11} = (1 \times 0) + (2 \times 1) = 0 + 2 = 2$

$c_{12} = (1 \times -1) + (2 \times 5) = -1 + 10 = 9$

$c_{21} = (3 \times 0) + (4 \times 1) = 0 + 4 = 4$

$c_{22} = (3 \times -1) + (4 \times 5) = -3 + 20 = 17$

So, the product matrix $AB$ is:

$AB = \begin{bmatrix} 2 & 9 \\ 4 & 17 \end{bmatrix}$

Next, we need to find the transpose of $AB$. The transpose of a matrix is obtained by interchanging its rows and columns.

$(AB)' = \left( \begin{bmatrix} 2 & 9 \\ 4 & 17 \end{bmatrix} \right)' = \begin{bmatrix} 2 & 4 \\ 9 & 17 \end{bmatrix}$

Alternatively, we can use the property that $(AB)' = B'A'$.

First, find the transposes of A and B:

$A' = \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix}$

$B' = \begin{bmatrix} 0 & 1 \\ -1 & 5 \end{bmatrix}$

Now, calculate the product $B'A'$:

$B'A' = \begin{bmatrix} 0 & 1 \\ -1 & 5 \end{bmatrix} \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix}$

Let $B'A' = D = \begin{bmatrix} d_{11} & d_{12} \\ d_{21} & d_{22} \end{bmatrix}$.

$d_{11} = (0 \times 1) + (1 \times 2) = 0 + 2 = 2$

$d_{12} = (0 \times 3) + (1 \times 4) = 0 + 4 = 4$

$d_{21} = (-1 \times 1) + (5 \times 2) = -1 + 10 = 9$

$d_{22} = (-1 \times 3) + (5 \times 4) = -3 + 20 = 17$

So, $B'A' = \begin{bmatrix} 2 & 4 \\ 9 & 17 \end{bmatrix}$. This confirms our previous result for $(AB)'$.

The calculated result for $(AB)'$ is $\begin{bmatrix} 2 & 4 \\ 9 & 17 \end{bmatrix}$.

Comparing this result with the given options:

(A) $\begin{bmatrix} 2 & 7 \\ 4 & 23 \end{bmatrix}$

(B) $\begin{bmatrix} 2 & 4 \\ 7 & 23 \end{bmatrix}$

(C) $\begin{bmatrix} 2 & 4 \\ 23 & 7 \end{bmatrix}$

(D) $\begin{bmatrix} 4 & 2 \\ 23 & 7 \end{bmatrix}$

Based on the given matrices $A$ and $B$, our calculated result is $(AB)' = \begin{bmatrix} 2 & 4 \\ 9 & 17 \end{bmatrix}$.


None of the provided options match this result, which suggests a misprint in the options for this question.
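
As a numerical cross-check (a NumPy sketch we add for illustration), both routes to $(AB)'$ give the same matrix, confirming the calculation above:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, -1], [1, 5]])

print((A @ B).T)   # transpose of the product: [[ 2  4] [ 9 17]]
print(B.T @ A.T)   # product of transposes in reverse order: same matrix
```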

Question 11. If $A$ and $B$ are two matrices such that $AB$ is defined, then $(AB)'$ is equal to:

(A) $A'B'$

(B) $B'A'$

(C) $AB$

(D) $BA$

Answer:

This question asks about a fundamental property of matrix transposes concerning matrix multiplication.

Let $A$ be a matrix of order $m \times n$ and $B$ be a matrix of order $n \times p$. The product $AB$ is defined, and the resulting matrix $AB$ has the order $m \times p$.

The transpose of a matrix $X$ is denoted by $X'$ or $X^T$. The transpose is obtained by interchanging the rows and columns of the original matrix. If $X$ has order $m \times n$, then $X'$ has order $n \times m$.

The property of the transpose of a matrix product states that the transpose of the product of two matrices is equal to the product of their transposes in reverse order.

Symbolically, for matrices $A$ and $B$ such that $AB$ is defined, the property is:

$(AB)' = B'A'$

Let's verify the orders. If $A$ is $m \times n$, then $A'$ is $n \times m$. If $B$ is $n \times p$, then $B'$ is $p \times n$.

The order of $AB$ is $m \times p$, so the order of $(AB)'$ is $p \times m$.

The product $B'A'$ multiplies a $p \times n$ matrix by an $n \times m$ matrix. Since the number of columns in $B'$ ($n$) equals the number of rows in $A'$ ($n$), the product $B'A'$ is defined, and its order is $p \times m$. The orders match.

Now consider the other options:

(A) $A'B'$: This would multiply an $n \times m$ matrix by a $p \times n$ matrix, which is only defined if $m = p$, and the resulting order would be $n \times n$. This is generally not equal to the order of $(AB)'$, which is $p \times m$.

(C) $AB$: $(AB)'$ is equal to $AB$ only if $AB$ is a symmetric matrix (i.e., $(AB)' = AB$). This is not true for all matrices $A$ and $B$.

(D) $BA$: The product $BA$ is defined only if the number of columns in $B$ (p) equals the number of rows in $A$ (m), i.e., $p=m$. Even if $BA$ is defined, matrix multiplication is not commutative in general ($AB \neq BA$), and $(AB)'$ is generally not equal to $BA$.

Therefore, the correct property is $(AB)' = B'A'$.


The correct option is (B) $B'A'$.

Question 12. A square matrix $A$ is skew-symmetric if:

(A) $A' = A$

(B) $A' = -A$

(C) $A^2 = I$

(D) $A^2 = A$

Answer:

A square matrix $A$ is classified based on its relationship with its transpose $A'$.

A square matrix $A$ is defined as a symmetric matrix if its transpose is equal to the matrix itself, i.e., $A' = A$. For example, if $A = \begin{bmatrix} 2 & 3 \\ 3 & 5 \end{bmatrix}$, then $A' = \begin{bmatrix} 2 & 3 \\ 3 & 5 \end{bmatrix}$, so $A' = A$.

A square matrix $A$ is defined as a skew-symmetric matrix if its transpose is equal to the negative of the matrix itself, i.e., $A' = -A$. This means that for every element $a_{ij}$ of matrix $A$, the corresponding element $a_{ji}$ of its transpose $A'$ must satisfy $a_{ji} = -a_{ij}$. For the diagonal elements ($i=j$), this implies $a_{ii} = -a_{ii}$, which means $2a_{ii} = 0$, so $a_{ii} = 0$. Thus, all diagonal elements of a skew-symmetric matrix must be zero. For example, if $A = \begin{bmatrix} 0 & -3 \\ 3 & 0 \end{bmatrix}$, then $A' = \begin{bmatrix} 0 & 3 \\ -3 & 0 \end{bmatrix}$, and $-A = \begin{bmatrix} 0 & -(-3) \\ -(3) & 0 \end{bmatrix} = \begin{bmatrix} 0 & 3 \\ -3 & 0 \end{bmatrix}$. So, $A' = -A$.

Other options represent different types of matrices:

(C) $A^2 = I$: A matrix $A$ is called an involutory matrix if $A^2 = I$, where $I$ is the identity matrix.

(D) $A^2 = A$: A matrix $A$ is called an idempotent matrix if $A^2 = A$.

Based on the definitions, a square matrix $A$ is skew-symmetric if $A' = -A$.


The correct option is (B) $A' = -A$.

Question 13. Which of the following is a property of a skew-symmetric matrix $A = [a_{ij}]$?

(A) $a_{ij} = a_{ji}$ for all $i, j$

(B) $a_{ij} = 0$ for $i=j$ (diagonal elements are zero)

(C) $a_{ij} \neq 0$ for $i=j$

(D) $A^2 = I$

Answer:

A square matrix $A = [a_{ij}]$ is defined as a skew-symmetric matrix if its transpose $A'$ is equal to the negative of the matrix $A$. Mathematically, this is expressed as $A' = -A$.

The element in the $j$-th row and $i$-th column of $A'$ ($a'_{ji}$) is equal to the element in the $i$-th row and $j$-th column of $A$ ($a_{ij}$). So, $a'_{ji} = a_{ij}$.

The definition $A' = -A$ means that each element of $A'$ is the negative of the corresponding element of $A$. So, $a'_{ji} = -a_{ji}$ for all $i$ and $j$.

Combining these, we get the property of elements for a skew-symmetric matrix:

$a_{ij} = -a_{ji}$ for all $i, j$.

Now let's examine the diagonal elements of the matrix. For diagonal elements, the row index $i$ is equal to the column index $j$, i.e., $i=j$. Substituting $j=i$ into the property $a_{ij} = -a_{ji}$, we get:

$a_{ii} = -a_{ii}$

Adding $a_{ii}$ to both sides of the equation:

$a_{ii} + a_{ii} = 0$

$2a_{ii} = 0$

Dividing by 2:

$a_{ii} = 0$

This shows that for a skew-symmetric matrix, all the elements on the main diagonal must be zero.

Let's evaluate the given options:

(A) $a_{ij} = a_{ji}$ for all $i, j$: This is the property of a symmetric matrix, not a skew-symmetric matrix.

(B) $a_{ij} = 0$ for $i=j$ (diagonal elements are zero): As derived above, this is a necessary property of a skew-symmetric matrix.

(C) $a_{ij} \neq 0$ for $i=j$: This directly contradicts the property that diagonal elements must be zero.

(D) $A^2 = I$: This is the definition of an involutory matrix.

Therefore, the property that the diagonal elements are zero is a characteristic of a skew-symmetric matrix.


The correct option is (B) $a_{ij} = 0$ for $i=j$ (diagonal elements are zero).

Question 14. Any square matrix $A$ can be expressed as the sum of a symmetric matrix and a skew-symmetric matrix. The symmetric part is given by:

(A) $\frac{1}{2}(A - A')$

(B) $\frac{1}{2}(A + A')$

(C) $A + A'$

(D) $A - A'$

Answer:

Let $A$ be a square matrix. We can express $A$ as the sum of two matrices, one symmetric and one skew-symmetric.

Consider the expression $A = \frac{1}{2}(A + A') + \frac{1}{2}(A - A')$.

Let $P = \frac{1}{2}(A + A')$ and $Q = \frac{1}{2}(A - A')$. So, $A = P + Q$.

We need to check if $P$ is symmetric and $Q$ is skew-symmetric.

To check if $P$ is symmetric, we find its transpose $P'$:

$P' = \left( \frac{1}{2}(A + A') \right)'$

Using the properties $(cA)' = cA'$ and $(X+Y)' = X' + Y'$:

$P' = \frac{1}{2}(A + A')'$

$P' = \frac{1}{2}(A' + (A')')$

Using the property $(A')' = A$:

$P' = \frac{1}{2}(A' + A)$

Since matrix addition is commutative ($A' + A = A + A'$):

$P' = \frac{1}{2}(A + A')$

Thus, $P' = P$. This confirms that $P = \frac{1}{2}(A + A')$ is a symmetric matrix.


To check if $Q$ is skew-symmetric, we find its transpose $Q'$:

$Q' = \left( \frac{1}{2}(A - A') \right)'$

Using the properties $(cA)' = cA'$ and $(X-Y)' = X' - Y'$:

$Q' = \frac{1}{2}(A - A')'$

$Q' = \frac{1}{2}(A' - (A')')$

Using the property $(A')' = A$:

$Q' = \frac{1}{2}(A' - A)$

We can factor out $-1$ from the expression $(A' - A)$:

$Q' = \frac{1}{2}(-(A - A'))$

$Q' = - \frac{1}{2}(A - A')$

Thus, $Q' = -Q$. This confirms that $Q = \frac{1}{2}(A - A')$ is a skew-symmetric matrix.


So, any square matrix $A$ can be uniquely expressed as the sum of a symmetric matrix $P = \frac{1}{2}(A + A')$ and a skew-symmetric matrix $Q = \frac{1}{2}(A - A')$.

The question asks for the symmetric part, which is $P$.

$P = \frac{1}{2}(A + A')$

Comparing this with the given options, we see that it matches option (B).


The correct option is (B) $\frac{1}{2}(A + A')$.
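
The decomposition is easy to verify numerically. A minimal sketch, assuming NumPy and an arbitrary example matrix:

```python
import numpy as np

A = np.array([[1.0, 7.0],
              [3.0, 5.0]])
P = (A + A.T) / 2   # symmetric part
Q = (A - A.T) / 2   # skew-symmetric part

assert np.allclose(P, P.T)     # P' = P  (symmetric)
assert np.allclose(Q, -Q.T)    # Q' = -Q (skew-symmetric)
assert np.allclose(P + Q, A)   # A = P + Q
print(P, Q, sep="\n")
```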

Question 15. Which of the following is an elementary column operation?

(A) Swapping two rows.

(B) Multiplying a column by a non-zero scalar.

(C) Adding a multiple of a row to another row.

(D) Adding a constant value to each element of a column.

Answer:

Elementary operations on matrices are operations that can be performed on the rows or columns of a matrix to transform it into a simpler form, such as row echelon form or reduced row echelon form.

There are three types of elementary operations, which can be applied to either rows or columns:

Elementary Row Operations:

1. Swapping two rows ($R_i \leftrightarrow R_j$).

2. Multiplying a row by a non-zero scalar ($R_i \to cR_i$, where $c \neq 0$).

3. Adding a multiple of one row to another row ($R_i \to R_i + kR_j$, where $i \neq j$).

Elementary Column Operations:

1. Swapping two columns ($C_i \leftrightarrow C_j$).

2. Multiplying a column by a non-zero scalar ($C_i \to cC_i$, where $c \neq 0$).

3. Adding a multiple of one column to another column ($C_i \to C_i + kC_j$, where $i \neq j$).

Now let's look at the given options:

(A) Swapping two rows: This is an elementary row operation.

(B) Multiplying a column by a non-zero scalar: This matches the second type of elementary column operation.

(C) Adding a multiple of a row to another row: This is an elementary row operation.

(D) Adding a constant value to each element of a column: This is not one of the standard elementary row or column operations. Elementary operations involve linear combinations or scaling of entire rows or columns, or swapping them.

Based on the definitions, option (B) describes an elementary column operation.


The correct option is (B) Multiplying a column by a non-zero scalar.

Question 16. If $A$ is an invertible matrix, then $(A^{-1})^{-1}$ is equal to:

(A) $A$

(B) $A^{-1}$

(C) $A'$

(D) $-A$

Answer:

Let $A$ be an invertible matrix. By the definition of an invertible matrix, there exists a unique matrix $A^{-1}$ such that:

$A A^{-1} = I$

... (i)

$A^{-1} A = I$

... (ii)

where $I$ is the identity matrix of the same order as $A$.

We are asked to find the value of $(A^{-1})^{-1}$. This means we are looking for the inverse of the matrix $A^{-1}$.

By the definition of the inverse of a matrix, the inverse of $A^{-1}$ is a matrix, let's call it $X$, such that:

$(A^{-1}) X = I$

... (iii)

$X (A^{-1}) = I$

... (iv)

Comparing equation (i) with equation (iii), we have $A A^{-1} = I$ and $(A^{-1}) X = I$. If we let $X = A$, equation (iii) becomes $(A^{-1}) A = I$, which is true according to equation (ii).

Comparing equation (ii) with equation (iv), we have $A^{-1} A = I$ and $X (A^{-1}) = I$. If we let $X = A$, equation (iv) becomes $A (A^{-1}) = I$, which is true according to equation (i).

Since the inverse of a matrix is unique, the matrix $X$ that satisfies equations (iii) and (iv) must be equal to $A$.

Therefore, the inverse of $A^{-1}$ is $A$, i.e., $(A^{-1})^{-1} = A$.

This is a standard property of matrix inverses: the inverse of the inverse of an invertible matrix is the original matrix.

Comparing this result with the given options, we see that it matches option (A).


The correct option is (A) $A$.

Question 17. If $A$ and $B$ are invertible matrices of the same order, then $(B^{-1}A^{-1})^{-1}$ is equal to:

(A) $AB$

(B) $BA$

(C) $A^{-1}B^{-1}$

(D) $B^{-1}A^{-1}$

Answer:

We are given that $A$ and $B$ are invertible matrices of the same order. We need to find the value of $(B^{-1}A^{-1})^{-1}$.

We will use two important properties of matrix inverses:

Property 1: For any invertible matrix $X$, the inverse of the inverse is the matrix itself, i.e., $(X^{-1})^{-1} = X$.

Property 2: For any two invertible matrices $X$ and $Y$ of the same order, the inverse of their product is the product of their inverses in reverse order, i.e., $(XY)^{-1} = Y^{-1}X^{-1}$.


Let the matrix inside the outer inverse be $C = B^{-1}A^{-1}$. We want to find $C^{-1} = (B^{-1}A^{-1})^{-1}$.

We can apply Property 2 by letting $X = B^{-1}$ and $Y = A^{-1}$. Since $A$ and $B$ are invertible, $A^{-1}$ and $B^{-1}$ are also invertible.

Using Property 2 with $X = B^{-1}$ and $Y = A^{-1}$:

$( (B^{-1}) (A^{-1}) )^{-1} = (A^{-1})^{-1} (B^{-1})^{-1}$

Now, we apply Property 1 to the terms on the right side:

$(A^{-1})^{-1} = A$

$(B^{-1})^{-1} = B$

Substituting these back into the equation:

$(B^{-1}A^{-1})^{-1} = A B$

So, the inverse of the matrix product $B^{-1}A^{-1}$ is the matrix product $AB$.

Comparing this result with the given options, we see that it matches option (A).


The correct option is (A) $AB$.

Question 18. If a matrix $A$ is such that $AA^{-1} = I$, where $I$ is the identity matrix, then $A^{-1}$ is the ____ of $A$.

(A) Transpose

(B) Adjoint

(C) Inverse

(D) Determinant

Answer:

Let $A$ be a square matrix. A matrix $B$ of the same order as $A$ is called the inverse of $A$ if $AB = I$ and $BA = I$, where $I$ is the identity matrix.

If such a matrix $B$ exists, it is unique and is denoted by $A^{-1}$.

The given condition is $AA^{-1} = I$. This equation is part of the definition of the inverse of matrix $A$. The matrix $A^{-1}$ is the unique matrix that satisfies both $AA^{-1} = I$ and $A^{-1}A = I$.

Let's consider the other options:

(A) Transpose: The transpose of a matrix $A$, denoted by $A'$ or $A^T$, is obtained by interchanging the rows and columns of $A$. There is no general property that $AA' = I$. For example, if $A = \begin{bmatrix} 1 & 0 \\ 0 & 2 \end{bmatrix}$, $A' = \begin{bmatrix} 1 & 0 \\ 0 & 2 \end{bmatrix}$, and $AA' = \begin{bmatrix} 1 & 0 \\ 0 & 4 \end{bmatrix} \neq I$.

(B) Adjoint: The adjoint of a square matrix $A$, denoted by adj$(A)$, is the transpose of the cofactor matrix of $A$. The relationship between a matrix, its adjoint, and its inverse is given by $A (\text{adj}(A)) = (\text{adj}(A)) A = \det(A) I$. If $A$ is invertible, then $\det(A) \neq 0$, and $A^{-1} = \frac{1}{\det(A)} \text{adj}(A)$. So, the adjoint is related to the inverse, but $A^{-1}$ is not the adjoint itself unless $\det(A)=1$.

(D) Determinant: The determinant of a square matrix $A$, denoted by $\det(A)$ or $|A|$, is a scalar value associated with the matrix. It is a single number, not a matrix. Therefore, $A^{-1}$ cannot be the determinant of $A$.

The equation $AA^{-1} = I$ directly states that $A^{-1}$ is the matrix which, when multiplied by $A$, results in the identity matrix. This is the definition of the inverse matrix.


The correct option is (C) Inverse.

Question 19. Which of the following matrices is singular? (Select all that apply)

(A) $\begin{bmatrix} 2 & 1 \\ 4 & 2 \end{bmatrix}$

(B) $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

(C) $\begin{bmatrix} 0 & 5 \\ 0 & 3 \end{bmatrix}$

(D) $\begin{bmatrix} 3 & 1 \\ 2 & 2 \end{bmatrix}$

Answer:

A square matrix is considered singular if its determinant is equal to zero. Conversely, a square matrix is non-singular (or invertible) if its determinant is non-zero.

We need to calculate the determinant of each given $2 \times 2$ matrix. The determinant of a $2 \times 2$ matrix $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is given by the formula $\det \begin{bmatrix} a & b \\ c & d \end{bmatrix} = ad - bc$.


Let's calculate the determinant for each option:

(A) Matrix $A = \begin{bmatrix} 2 & 1 \\ 4 & 2 \end{bmatrix}$

$\det(A) = (2 \times 2) - (1 \times 4) = 4 - 4 = 0$

Since $\det(A) = 0$, matrix (A) is singular.


(B) Matrix $B = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

$\det(B) = (1 \times 1) - (0 \times 0) = 1 - 0 = 1$

Since $\det(B) = 1 \neq 0$, matrix (B) is non-singular.


(C) Matrix $C = \begin{bmatrix} 0 & 5 \\ 0 & 3 \end{bmatrix}$

$\det(C) = (0 \times 3) - (5 \times 0) = 0 - 0 = 0$

Since $\det(C) = 0$, matrix (C) is singular.


(D) Matrix $D = \begin{bmatrix} 3 & 1 \\ 2 & 2 \end{bmatrix}$

$\det(D) = (3 \times 2) - (1 \times 2) = 6 - 2 = 4$

Since $\det(D) = 4 \neq 0$, matrix (D) is non-singular.


The matrices with a determinant of zero are (A) and (C).


The correct options are (A) $\begin{bmatrix} 2 & 1 \\ 4 & 2 \end{bmatrix}$ and (C) $\begin{bmatrix} 0 & 5 \\ 0 & 3 \end{bmatrix}$.
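
The same determinant test $ad - bc$ can be run programmatically (a small Python sketch added for illustration):

```python
# Determinant of a 2x2 matrix [[a, b], [c, d]] is ad - bc
def det2(M):
    (a, b), (c, d) = M
    return a * d - b * c

candidates = {"A": [[2, 1], [4, 2]],
              "B": [[1, 0], [0, 1]],
              "C": [[0, 5], [0, 3]],
              "D": [[3, 1], [2, 2]]}

for name, M in candidates.items():
    d = det2(M)
    print(name, "det =", d, "->", "singular" if d == 0 else "non-singular")
# A det = 0 -> singular
# B det = 1 -> non-singular
# C det = 0 -> singular
# D det = 4 -> non-singular
```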

Question 20. Assertion (A): If $A$ is a square matrix, then $A+A'$ is always symmetric.

Reason (R): $(A+A')' = A' + (A')' = A' + A = A+A'$.

(A) Both A and R are true and R is the correct explanation of A.

(B) Both A and R are true but R is not the correct explanation of A.

(C) A is true but R is false.

(D) A is false but R is true.

Answer:

Let $A$ be a square matrix.


Assertion (A): If $A$ is a square matrix, then $A+A'$ is always symmetric.

A matrix $M$ is symmetric if its transpose $M'$ is equal to $M$. We need to check if the matrix $M = A+A'$ is symmetric, i.e., if $(A+A')' = A+A'$.

Using the properties of matrix transpose:

The transpose of a sum of matrices is the sum of their transposes: $(X+Y)' = X' + Y'$.

The transpose of the transpose of a matrix is the original matrix: $(X')' = X$.

Applying these properties to $(A+A')'$:

$(A+A')' = A' + (A')'$

$(A+A')' = A' + A$

Since matrix addition is commutative ($A' + A = A + A'$):

$(A+A')' = A + A'$

Since $(A+A')' = A+A'$, the matrix $A+A'$ is symmetric.

Therefore, the Assertion (A) is True.


Reason (R): $(A+A')' = A' + (A')' = A' + A = A+A'$.

This statement provides the steps for calculating the transpose of $A+A'$. The steps correctly use the properties of transpose: $(A+A')' = A' + (A')'$ (transpose of sum is sum of transposes) and $(A')' = A$. It also correctly uses the commutative property of matrix addition: $A' + A = A + A'$.

Therefore, the Reason (R) is a correct mathematical derivation and is True.


Relationship between Assertion and Reason:

The Reason shows that the transpose of the matrix $(A+A')$ is equal to $(A+A')$ itself. This calculation directly proves that the matrix $(A+A')$ satisfies the condition for being a symmetric matrix.

Therefore, the Reason (R) correctly explains why the Assertion (A) is true.


Based on the analysis, both Assertion and Reason are true, and the Reason is the correct explanation for the Assertion.


The correct option is (A) Both A and R are true and R is the correct explanation of A.

Question 21. If $A$ is a $3 \times 4$ matrix, what is the order of $A'$?

(A) $3 \times 4$

(B) $4 \times 3$

(C) $4 \times 4$

(D) $3 \times 3$

Answer:

The order of a matrix is given by the number of rows $\times$ the number of columns.

We are given that matrix $A$ is a $3 \times 4$ matrix. This means that matrix $A$ has 3 rows and 4 columns.

The transpose of a matrix, denoted by $A'$ or $A^T$, is obtained by interchanging the rows and columns of the original matrix.

If a matrix $A$ has order $m \times n$ (meaning $m$ rows and $n$ columns), then its transpose $A'$ will have order $n \times m$ (meaning $n$ rows and $m$ columns).

In this question, the order of matrix $A$ is $3 \times 4$. Here, $m = 3$ and $n = 4$.

According to the rule for transposing matrices, the order of $A'$ will be $n \times m$.

So, the order of $A'$ is $4 \times 3$. This means $A'$ will have 4 rows and 3 columns.

Comparing this result with the given options, we see that it matches option (B).


The correct option is (B) $4 \times 3$.

Question 22. If $A = \begin{bmatrix} 2 & 0 \\ 1 & 3 \end{bmatrix}$, find $A^2 - 5A + 6I$, where $I = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$.

(A) $\begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$

(B) $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

(C) $\begin{bmatrix} 4 & 0 \\ 2 & 9 \end{bmatrix}$

(D) $\begin{bmatrix} 10 & 0 \\ 5 & 15 \end{bmatrix}$

Answer:

We are given the matrix $A = \begin{bmatrix} 2 & 0 \\ 1 & 3 \end{bmatrix}$ and the identity matrix $I = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$. We need to calculate the value of the expression $A^2 - 5A + 6I$.


First, let's calculate $A^2 = A \times A$:

$A^2 = \begin{bmatrix} 2 & 0 \\ 1 & 3 \end{bmatrix} \begin{bmatrix} 2 & 0 \\ 1 & 3 \end{bmatrix}$

$A^2 = \begin{bmatrix} (2 \times 2) + (0 \times 1) & (2 \times 0) + (0 \times 3) \\ (1 \times 2) + (3 \times 1) & (1 \times 0) + (3 \times 3) \end{bmatrix}$

$A^2 = \begin{bmatrix} 4 + 0 & 0 + 0 \\ 2 + 3 & 0 + 9 \end{bmatrix}$

$A^2 = \begin{bmatrix} 4 & 0 \\ 5 & 9 \end{bmatrix}$


Next, let's calculate $5A$ (scalar multiplication):

$5A = 5 \times \begin{bmatrix} 2 & 0 \\ 1 & 3 \end{bmatrix}$

$5A = \begin{bmatrix} 5 \times 2 & 5 \times 0 \\ 5 \times 1 & 5 \times 3 \end{bmatrix}$

$5A = \begin{bmatrix} 10 & 0 \\ 5 & 15 \end{bmatrix}$


Now, let's calculate $6I$ (scalar multiplication):

$6I = 6 \times \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

$6I = \begin{bmatrix} 6 \times 1 & 6 \times 0 \\ 6 \times 0 & 6 \times 1 \end{bmatrix}$

$6I = \begin{bmatrix} 6 & 0 \\ 0 & 6 \end{bmatrix}$


Finally, we calculate $A^2 - 5A + 6I$:

$A^2 - 5A + 6I = \begin{bmatrix} 4 & 0 \\ 5 & 9 \end{bmatrix} - \begin{bmatrix} 10 & 0 \\ 5 & 15 \end{bmatrix} + \begin{bmatrix} 6 & 0 \\ 0 & 6 \end{bmatrix}$

First, perform the subtraction $A^2 - 5A$:

$\begin{bmatrix} 4 & 0 \\ 5 & 9 \end{bmatrix} - \begin{bmatrix} 10 & 0 \\ 5 & 15 \end{bmatrix} = \begin{bmatrix} 4 - 10 & 0 - 0 \\ 5 - 5 & 9 - 15 \end{bmatrix} = \begin{bmatrix} -6 & 0 \\ 0 & -6 \end{bmatrix}$

Now, add $6I$ to the result:

$\begin{bmatrix} -6 & 0 \\ 0 & -6 \end{bmatrix} + \begin{bmatrix} 6 & 0 \\ 0 & 6 \end{bmatrix} = \begin{bmatrix} -6 + 6 & 0 + 0 \\ 0 + 0 & -6 + 6 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$

The result is the null matrix of order $2 \times 2$.


This calculation is an illustration of the Cayley-Hamilton theorem, which states that every square matrix satisfies its own characteristic equation. For matrix $A = \begin{bmatrix} 2 & 0 \\ 1 & 3 \end{bmatrix}$, the characteristic equation is $\det(A - \lambda I) = 0$.

$\det \begin{bmatrix} 2-\lambda & 0 \\ 1 & 3-\lambda \end{bmatrix} = (2-\lambda)(3-\lambda) - (0)(1) = 6 - 2\lambda - 3\lambda + \lambda^2 = \lambda^2 - 5\lambda + 6$

The characteristic equation is $\lambda^2 - 5\lambda + 6 = 0$. According to the Cayley-Hamilton theorem, substituting the matrix $A$ for $\lambda$ and the identity matrix $I$ for the constant term gives $A^2 - 5A + 6I = O$, where $O$ is the null matrix.


The result of the expression $A^2 - 5A + 6I$ is $\begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$.

Comparing this result with the given options, we see that it matches option (A).


The correct option is (A) $\begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$.
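
This Cayley-Hamilton identity can be verified directly (a NumPy sketch added for illustration):

```python
import numpy as np

A = np.array([[2, 0],
              [1, 3]])
I = np.eye(2, dtype=int)

# A satisfies its characteristic equation x^2 - 5x + 6 = 0
print(A @ A - 5 * A + 6 * I)
# [[0 0]
#  [0 0]]
```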

Question 23. Match the matrices in Column I with their type in Column II.

(i) $\begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$

(ii) $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

(iii) $\begin{bmatrix} 5 & 0 \\ 0 & -2 \end{bmatrix}$

(iv) $\begin{bmatrix} 3 & 0 \\ 0 & 3 \end{bmatrix}$

(a) Diagonal matrix (not scalar)

(b) Identity matrix

(c) Zero matrix

(d) Scalar matrix (not identity)

Answer:

Let's identify the type of each matrix in Column I based on the definitions of matrix types.


Matrix (i) is $\begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$. This is a square matrix where all elements are zero. This is the definition of a Zero matrix.

Matching: (i) - (c)


Matrix (ii) is $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$. This is a square matrix where all diagonal elements are 1 and all non-diagonal elements are 0. This is the definition of an Identity matrix.

Matching: (ii) - (b)


Matrix (iii) is $\begin{bmatrix} 5 & 0 \\ 0 & -2 \end{bmatrix}$. This is a square matrix where all non-diagonal elements are zero. The diagonal elements are $5$ and $-2$. Since the diagonal elements are not all equal, it is a Diagonal matrix, but not a scalar matrix or an identity matrix. The description "(a) Diagonal matrix (not scalar)" fits.

Matching: (iii) - (a)


Matrix (iv) is $\begin{bmatrix} 3 & 0 \\ 0 & 3 \end{bmatrix}$. This is a square matrix where all non-diagonal elements are zero, and all diagonal elements are equal to a non-zero constant (3). This is the definition of a Scalar matrix. Since the diagonal elements are 3 (not 1), it is not an identity matrix. The description "(d) Scalar matrix (not identity)" fits.

Matching: (iv) - (d)


Summary of the matching:

(i) - (c)

(ii) - (b)

(iii) - (a)

(iv) - (d)

Question 24. If $A = \begin{bmatrix} x & 5 \\ 2 & 3 \end{bmatrix}$ is equal to $B = \begin{bmatrix} 4 & 5 \\ 2 & y \end{bmatrix}$, then the values of $x$ and $y$ are:

(A) $x=4, y=3$

(B) $x=3, y=4$

(C) $x=2, y=5$

(D) $x=5, y=2$

Answer:

Two matrices are equal if and only if they have the same order and their corresponding elements are equal.

Given the matrices $A = \begin{bmatrix} x & 5 \\ 2 & 3 \end{bmatrix}$ and $B = \begin{bmatrix} 4 & 5 \\ 2 & y \end{bmatrix}$.

Both matrices are of order $2 \times 2$, so they have the same order.

For the matrices to be equal, their corresponding elements must be equal. This means:

The element in the first row, first column of $A$ must be equal to the element in the first row, first column of $B$.

$a_{11} = b_{11} \implies x = 4$

The element in the first row, second column of $A$ must be equal to the element in the first row, second column of $B$.

$a_{12} = b_{12} \implies 5 = 5$ (This is already true and doesn't help in finding $x$ or $y$)

The element in the second row, first column of $A$ must be equal to the element in the second row, first column of $B$.

$a_{21} = b_{21} \implies 2 = 2$ (This is also already true and doesn't help in finding $x$ or $y$)

The element in the second row, second column of $A$ must be equal to the element in the second row, second column of $B$.

$a_{22} = b_{22} \implies 3 = y$

From the equality of the corresponding elements, we get $x = 4$ and $y = 3$.

Comparing these values with the given options, we see that they match option (A).


The correct option is (A) $x=4, y=3$.

Question 25. If $A$ is a square matrix of order $n$, then $A \cdot adj(A) = \dots$

(A) $I$

(B) $O$

(C) $|A| I$

(D) $|A| A^{-1}$

Answer:

Let $A$ be a square matrix of order $n$. The adjoint of a square matrix $A$, denoted as adj$(A)$, is the transpose of the cofactor matrix of $A$.

A fundamental property relating a square matrix $A$, its adjoint adj$(A)$, its determinant $|A|$ (or $\det(A)$), and the identity matrix $I$ of the same order $n$ is given by the following equation:

$A \cdot \text{adj}(A) = \text{adj}(A) \cdot A = |A| I$

Here, $I$ is the identity matrix of order $n$, and $|A|$ is the determinant of matrix $A$, which is a scalar value.

This property holds true for any square matrix, regardless of whether it is invertible or singular.

Let's examine the given options:

(A) $I$: This is true only if $|A|=1$. This is not always the case for any square matrix $A$.

(B) $O$: This is the null matrix. This is true only if $|A|=0$, which means the matrix $A$ is singular. This is not always the case for any square matrix $A$.

(C) $|A| I$: This matches the fundamental property $A \cdot \text{adj}(A) = |A| I$.

(D) $|A| A^{-1}$: The inverse of an invertible matrix $A$ is given by $A^{-1} = \frac{1}{|A|} \text{adj}(A)$, provided $|A| \neq 0$. Multiplying both sides by $|A|$ gives $|A| A^{-1} = \text{adj}(A)$. So, option (D) is equivalent to adj$(A)$. The equation $A \cdot \text{adj}(A) = \text{adj}(A)$ is not true in general for any matrix $A$.

Therefore, the correct expression for $A \cdot \text{adj}(A)$ is $|A| I$.


The correct option is (C) $|A| I$.
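
For a $2 \times 2$ matrix the adjoint has the simple closed form $\text{adj}\begin{bmatrix} a & b \\ c & d \end{bmatrix} = \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$, which makes the property easy to check numerically. A minimal sketch, assuming NumPy (the helper `adj_2x2` is our own illustrative name):

```python
import numpy as np

def adj_2x2(M):
    """Adjoint of a 2x2 matrix: swap the diagonal, negate the off-diagonal."""
    (a, b), (c, d) = M
    return np.array([[d, -b], [-c, a]])

A = np.array([[3, 1],
              [2, 2]])      # |A| = 3*2 - 1*2 = 4
print(A @ adj_2x2(A))       # [[4 0] [0 4]] = |A| I
print(adj_2x2(A) @ A)       # same: adj(A) A = |A| I as well
```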

Question 26. Which of the following is NOT a property of matrix multiplication?

(A) Associative: $(AB)C = A(BC)$

(B) Commutative: $AB = BA$

(C) Distributive: $A(B+C) = AB+AC$

(D) Existence of Multiplicative Identity (for square matrices)

Answer:

Let's examine each option to determine whether it is a general property of matrix multiplication.


(A) Associative: $(AB)C = A(BC)$

If the matrix products $AB$, $BC$, $(AB)C$, and $A(BC)$ are defined, then matrix multiplication is associative. This means the grouping of matrices in a product does not affect the result. For example, if $A$ is $m \times n$, $B$ is $n \times p$, and $C$ is $p \times q$, then $AB$ is $m \times p$, $BC$ is $n \times q$, $(AB)C$ is $m \times q$, and $A(BC)$ is $m \times q$. The dimensions are compatible, and the equality $(AB)C = A(BC)$ holds.

This property holds for matrix multiplication.


(B) Commutative: $AB = BA$

Matrix multiplication is generally not commutative. This means that for two matrices $A$ and $B$, the product $AB$ is not necessarily equal to $BA$.

Several situations illustrate this lack of commutativity:

1. The product $AB$ might be defined, but $BA$ might not be defined (e.g., if $A$ is $2 \times 3$ and $B$ is $3 \times 4$, $AB$ is $2 \times 4$, but $BA$ is not defined).

2. Both $AB$ and $BA$ might be defined, but they may have different orders (e.g., if $A$ is $2 \times 3$ and $B$ is $3 \times 2$, $AB$ is $2 \times 2$ and $BA$ is $3 \times 3$).

3. Both $AB$ and $BA$ might be defined and have the same order (which happens if $A$ and $B$ are square matrices of the same order), but the resulting matrices $AB$ and $BA$ are generally not equal.

For example, let $A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$ and $B = \begin{bmatrix} 1 & 0 \\ 1 & 1 \end{bmatrix}$.

$AB = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} (1 \times 1 + 1 \times 1) & (1 \times 0 + 1 \times 1) \\ (0 \times 1 + 1 \times 1) & (0 \times 0 + 1 \times 1) \end{bmatrix} = \begin{bmatrix} 2 & 1 \\ 1 & 1 \end{bmatrix}$

$BA = \begin{bmatrix} 1 & 0 \\ 1 & 1 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} (1 \times 1 + 0 \times 0) & (1 \times 1 + 0 \times 1) \\ (1 \times 1 + 1 \times 0) & (1 \times 1 + 1 \times 1) \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 1 & 2 \end{bmatrix}$

In this case, $AB \neq BA$. Thus, the commutative property does not hold for matrix multiplication in general.


(C) Distributive: $A(B+C) = AB+AC$

Matrix multiplication is distributive over matrix addition, provided the matrix sizes are compatible for all operations. Specifically, if $A$ is $m \times n$ and $B$ and $C$ are $n \times p$ (so that $B+C$ is defined and is $n \times p$, and $AB$ and $AC$ are defined and are $m \times p$), then the equality $A(B+C) = AB+AC$ holds. Similarly, right distribution $(A+B)C = AC+BC$ also holds when defined.

This property holds for matrix multiplication.


(D) Existence of Multiplicative Identity (for square matrices)

For any square matrix $A$ of order $n$, there exists a unique identity matrix $I_n$ of the same order such that $A I_n = I_n A = A$. This property is crucial in matrix algebra, similar to how the number 1 acts as the multiplicative identity for real numbers.

This property holds for square matrix multiplication.


Based on the analysis, the property that is NOT a general property of matrix multiplication is commutativity.


The correct option is (B) Commutative: $AB = BA$.

Question 27. If $A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$, then $A^n$ for a positive integer $n$ is:

(A) $\begin{bmatrix} 1 & 1^n \\ 0 & 1^n \end{bmatrix}$

(B) $\begin{bmatrix} 1 & n \\ 0 & 1 \end{bmatrix}$

(C) $\begin{bmatrix} 1^n & n \\ 0 & 1^n \end{bmatrix}$

(D) $\begin{bmatrix} n & n \\ 0 & n \end{bmatrix}$

Answer:

We are asked to find the formula for the $n$-th power of the matrix $A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$ for a positive integer $n$.

Let's compute the first few powers of $A$ to identify a pattern.

For $n=1$:

$A^1 = A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$

For $n=2$:

$A^2 = A \times A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} (1 \times 1) + (1 \times 0) & (1 \times 1) + (1 \times 1) \\ (0 \times 1) + (1 \times 0) & (0 \times 1) + (1 \times 1) \end{bmatrix} = \begin{bmatrix} 1 + 0 & 1 + 1 \\ 0 + 0 & 0 + 1 \end{bmatrix} = \begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix}$

For $n=3$:

$A^3 = A^2 \times A = \begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} (1 \times 1) + (2 \times 0) & (1 \times 1) + (2 \times 1) \\ (0 \times 1) + (1 \times 0) & (0 \times 1) + (1 \times 1) \end{bmatrix} = \begin{bmatrix} 1 + 0 & 1 + 2 \\ 0 + 0 & 0 + 1 \end{bmatrix} = \begin{bmatrix} 1 & 3 \\ 0 & 1 \end{bmatrix}$

For $n=4$:

$A^4 = A^3 \times A = \begin{bmatrix} 1 & 3 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} (1 \times 1) + (3 \times 0) & (1 \times 1) + (3 \times 1) \\ (0 \times 1) + (1 \times 0) & (0 \times 1) + (1 \times 1) \end{bmatrix} = \begin{bmatrix} 1 + 0 & 1 + 3 \\ 0 + 0 & 0 + 1 \end{bmatrix} = \begin{bmatrix} 1 & 4 \\ 0 & 1 \end{bmatrix}$

Observing the pattern, it appears that for a positive integer $n$, the matrix $A^n$ is given by $\begin{bmatrix} 1 & n \\ 0 & 1 \end{bmatrix}$. This can be proved rigorously by induction on $n$: if $A^k = \begin{bmatrix} 1 & k \\ 0 & 1 \end{bmatrix}$, then $A^{k+1} = A^k A = \begin{bmatrix} 1 & k \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} 1 & k+1 \\ 0 & 1 \end{bmatrix}$.

Let's compare this pattern with the given options:

(A) $\begin{bmatrix} 1 & 1^n \\ 0 & 1^n \end{bmatrix}$. Since $1^n = 1$ for any positive integer $n$, this simplifies to $\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$, which is only equal to $A^n$ when $n=1$.

(B) $\begin{bmatrix} 1 & n \\ 0 & 1 \end{bmatrix}$. This directly matches the pattern we observed.

(C) $\begin{bmatrix} 1^n & n \\ 0 & 1^n \end{bmatrix}$. Since $1^n = 1$ for any positive integer $n$, this simplifies to $\begin{bmatrix} 1 & n \\ 0 & 1 \end{bmatrix}$, which is the same as option (B) and matches the pattern.

(D) $\begin{bmatrix} n & n \\ 0 & n \end{bmatrix}$. For $n=1$, this is $\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$ (matches $A^1$). For $n=2$, this is $\begin{bmatrix} 2 & 2 \\ 0 & 2 \end{bmatrix}$, which does not match $A^2 = \begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix}$.

Both options (B) and (C) simplify to the same matrix $\begin{bmatrix} 1 & n \\ 0 & 1 \end{bmatrix}$, since $1^n = 1$ for every positive integer $n$, so both correctly represent $A^n$. Two equivalent correct options suggest an issue with the question design; since option (B) states the formula in its simplest form, we take it as the intended answer.


The correct option is (B) $\begin{bmatrix} 1 & n \\ 0 & 1 \end{bmatrix}$ (Option (C) is equivalent). We choose (B) as the most direct representation of the pattern.
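
The pattern (and the induction above) can be spot-checked with `np.linalg.matrix_power` (a NumPy sketch added for illustration):

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])

for n in range(1, 6):
    An = np.linalg.matrix_power(A, n)
    assert np.array_equal(An, np.array([[1, n], [0, 1]]))
    print(n, An.tolist())   # e.g. 3 [[1, 3], [0, 1]]
```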

Question 28. If $A$ and $B$ are symmetric matrices of the same order, then $AB$ is symmetric if and only if:

(A) $AB = BA$

(B) $AB = -BA$

(C) $AB = O$

(D) $A+B = I$

Answer:

We are given that $A$ and $B$ are symmetric matrices of the same order. By definition, a matrix $M$ is symmetric if its transpose $M'$ is equal to the matrix itself, i.e., $M' = M$.

Since $A$ is symmetric, we have:

$A' = A$

(Given that A is symmetric)

Since $B$ is symmetric, we have:

$B' = B$

(Given that B is symmetric)


We want to find the condition under which the product $AB$ is symmetric. The product $AB$ is symmetric if its transpose $(AB)'$ is equal to $AB$.

So, we need to find when $(AB)' = AB$.

We use the property of the transpose of a matrix product, which states that $(XY)' = Y'X'$ for any matrices $X$ and $Y$ for which the product $XY$ is defined.

Applying this property to $(AB)'$, we have:

$(AB)' = B'A'$

Now, substitute the conditions $A' = A$ and $B' = B$ (since $A$ and $B$ are symmetric) into the expression for $(AB)'$:

$(AB)' = (B)(A)$

$(AB)' = BA$

For the matrix $AB$ to be symmetric, we must have $(AB)' = AB$.

Substituting the result $(AB)' = BA$, the condition for $AB$ to be symmetric becomes:

$BA = AB$

Thus, if $A$ and $B$ are symmetric matrices of the same order, their product $AB$ is symmetric if and only if $A$ and $B$ commute with respect to matrix multiplication.


Let's examine the given options:

(A) $AB = BA$: This is the condition we derived for $AB$ to be symmetric.

(B) $AB = -BA$: This condition would imply $(AB)' = -AB$, meaning $AB$ is skew-symmetric.

(C) $AB = O$: If $AB$ is the zero matrix, then $AB$ is symmetric (since $O' = O$). However, $AB = O$ is a sufficient condition for $AB$ to be symmetric, not a necessary one: for example, if $A = B = I$, then $AB = I$ is symmetric even though $AB \neq O$. The question asks for the condition "if and only if".

(D) $A+B = I$: This is a condition on the sum of $A$ and $B$, which does not generally imply that $AB$ is symmetric.

The condition $AB = BA$ is both necessary and sufficient for the product of two symmetric matrices $A$ and $B$ to be symmetric.


The correct option is (A) $AB = BA$.
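
Both directions of the "if and only if" can be illustrated with small examples (a NumPy sketch we add; the matrices are arbitrary choices):

```python
import numpy as np

A = np.array([[1, 2], [2, 1]])   # symmetric
B = np.array([[3, 4], [4, 3]])   # symmetric and commutes with A
C = np.array([[1, 0], [0, 2]])   # symmetric but does NOT commute with A

print(np.array_equal(A @ B, B @ A))       # True:  AB = BA
print(np.array_equal((A @ B).T, A @ B))   # True:  AB is symmetric

print(np.array_equal(A @ C, C @ A))       # False: AC != CA
print(np.array_equal((A @ C).T, A @ C))   # False: AC is not symmetric
```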

Question 29. The matrix $\begin{bmatrix} 0 & -5 & 8 \\ 5 & 0 & 12 \\ -8 & -12 & 0 \end{bmatrix}$ is a:

(A) Symmetric matrix

(B) Skew-symmetric matrix

(C) Diagonal matrix

(D) Identity matrix

Answer:

Let the given matrix be $A = \begin{bmatrix} 0 & -5 & 8 \\ 5 & 0 & 12 \\ -8 & -12 & 0 \end{bmatrix}$.

We need to determine the type of this matrix by checking its properties.


Definition of Symmetric Matrix: A square matrix $A$ is symmetric if $A' = A$, meaning $a_{ij} = a_{ji}$ for all $i, j$.

Let's find the transpose of $A$ by interchanging its rows and columns:

$A' = \begin{bmatrix} 0 & 5 & -8 \\ -5 & 0 & -12 \\ 8 & 12 & 0 \end{bmatrix}$

Comparing $A'$ with $A$: $A' \neq A$ (e.g., the element in the first row, second column of $A'$ is 5, while in $A$ it is -5).

So, the matrix is not symmetric.


Definition of Skew-symmetric Matrix: A square matrix $A$ is skew-symmetric if $A' = -A$, meaning $a_{ij} = -a_{ji}$ for all $i, j$. For $i=j$, this implies $a_{ii} = -a_{ii}$, so $a_{ii} = 0$. The diagonal elements must be zero.

Let's find the negative of $A$ by multiplying each element by -1:

$-A = - \begin{bmatrix} 0 & -5 & 8 \\ 5 & 0 & 12 \\ -8 & -12 & 0 \end{bmatrix} = \begin{bmatrix} -(0) & -(-5) & -(8) \\ -(5) & -(0) & -(12) \\ -(-8) & -(-12) & -(0) \end{bmatrix} = \begin{bmatrix} 0 & 5 & -8 \\ -5 & 0 & -12 \\ 8 & 12 & 0 \end{bmatrix}$

Comparing $A'$ with $-A$, we see that they are equal:

$A' = \begin{bmatrix} 0 & 5 & -8 \\ -5 & 0 & -12 \\ 8 & 12 & 0 \end{bmatrix}$ and $-A = \begin{bmatrix} 0 & 5 & -8 \\ -5 & 0 & -12 \\ 8 & 12 & 0 \end{bmatrix}$

Since $A' = -A$, the matrix is skew-symmetric.

Also, observe that the diagonal elements are all zero (0, 0, 0), which is a necessary condition for a skew-symmetric matrix.


Definition of Diagonal Matrix: A square matrix is diagonal if all its non-diagonal elements are zero ($a_{ij} = 0$ for $i \neq j$).

In the given matrix $A$, the non-diagonal elements are -5, 8, 5, 12, -8, -12. Since some of these are non-zero, the matrix is not diagonal.


Definition of Identity Matrix: An identity matrix is a diagonal matrix where all diagonal elements are 1. It is typically denoted by $I$ or $I_n$ (for order $n$).

The given matrix is not a diagonal matrix, and its diagonal elements are 0, not 1. So, it is not an identity matrix.


Based on the analysis, the given matrix is a skew-symmetric matrix.


The correct option is (B) Skew-symmetric matrix.
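
A quick numerical confirmation (illustrative NumPy sketch, not part of the question):

```python
# Illustrative check (not from the question)
import numpy as np

A = np.array([[ 0,  -5,   8],
              [ 5,   0,  12],
              [-8, -12,   0]])

print(np.array_equal(A.T, -A))   # True  -> skew-symmetric
print(np.array_equal(A.T,  A))   # False -> not symmetric
```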

Question 30. Assertion (A): If $A$ is an invertible matrix, then $A'$ is also invertible.

Reason (R): $(A')^{-1} = (A^{-1})'$.

(A) Both A and R are true and R is the correct explanation of A.

(B) Both A and R are true but R is not the correct explanation of A.

(C) A is true but R is false.

(D) A is false but R is true.

Answer:

Let's analyze the Assertion and the Reason separately.


Assertion (A): If $A$ is an invertible matrix, then $A'$ is also invertible.

A square matrix $A$ is invertible if there exists a matrix $A^{-1}$ such that $AA^{-1} = A^{-1}A = I$, where $I$ is the identity matrix. A key property related to invertibility is that a square matrix $A$ is invertible if and only if its determinant is non-zero, i.e., $|A| \neq 0$.

We know the property that the determinant of the transpose of a matrix is equal to the determinant of the original matrix: $|A'| = |A|$.

If $A$ is invertible, then $|A| \neq 0$.

Since $|A'| = |A|$, if $|A| \neq 0$, it follows that $|A'| \neq 0$.

If $|A'| \neq 0$, then the matrix $A'$ is also invertible.

Therefore, the Assertion (A) is True.


Reason (R): $(A')^{-1} = (A^{-1})'$.

This statement claims a relationship between the inverse of the transpose of a matrix and the transpose of the inverse of the matrix. This is a known property of matrix operations.

Let $A$ be an invertible matrix. By definition, $A^{-1}$ exists and satisfies $AA^{-1} = I$ and $A^{-1}A = I$.

To show that $(A^{-1})'$ is the inverse of $A'$, we need to verify if the product of $A'$ and $(A^{-1})'$ (in both orders) is the identity matrix $I$.

Using the property $(XY)' = Y'X'$, we consider the product $A' (A^{-1})'$:

$A' (A^{-1})' = (A^{-1} A)'$

Since $A^{-1} A = I$:

$A' (A^{-1})' = I'$

The transpose of an identity matrix is the identity matrix itself ($I' = I$):

$A' (A^{-1})' = I$

Now, consider the product $(A^{-1})' A'$:

$(A^{-1})' A' = (A A^{-1})'$

Since $A A^{-1} = I$:

$(A^{-1})' A' = I'$

$(A^{-1})' A' = I$

Since multiplying $A'$ by $(A^{-1})'$ (in both orders) results in the identity matrix $I$, by the definition of the inverse matrix, $(A^{-1})'$ is indeed the inverse of $A'$.

Thus, $(A')^{-1} = (A^{-1})'$.

Therefore, the Reason (R) is True.


Relationship between Assertion and Reason:

The Assertion states that if $A$ is invertible, $A'$ is also invertible. The Reason provides the explicit formula for the inverse of $A'$ as $(A^{-1})'$. The existence of $(A^{-1})'$ implies the existence of $(A')^{-1}$, which in turn means $A'$ is invertible. Therefore, the Reason directly explains why the Assertion is true by providing the structure of the inverse of $A'$.


Based on the analysis, both the Assertion and the Reason are true, and the Reason provides the correct explanation for the Assertion.


The correct option is (A) Both A and R are true and R is the correct explanation of A.
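
The identity $(A')^{-1} = (A^{-1})'$ is easy to spot-check numerically; the sketch below (our own illustration, assuming NumPy) uses an arbitrary invertible matrix:

```python
# Illustrative check (not from the question); the matrix A is an arbitrary choice
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])       # invertible: det(A) = 1

lhs = np.linalg.inv(A.T)         # (A')^{-1}
rhs = np.linalg.inv(A).T         # (A^{-1})'
print(np.allclose(lhs, rhs))     # True
```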

Question 31. If an elementary row operation $R_i \leftrightarrow R_j$ is applied to an identity matrix $I$, the resulting matrix is called a/an ____ matrix.

(A) Identity

(B) Elementary

(C) Inverse

(D) Singular

Answer:

An elementary matrix is a matrix that is obtained by performing a single elementary row operation or a single elementary column operation on an identity matrix.

The three types of elementary row operations are:

1. Swapping two rows ($R_i \leftrightarrow R_j$).

2. Multiplying a row by a non-zero scalar ($R_i \to cR_i$, where $c \neq 0$).

3. Adding a multiple of one row to another row ($R_i \to R_i + kR_j$, where $i \neq j$).

Similarly, there are three types of elementary column operations.

The question describes applying the elementary row operation $R_i \leftrightarrow R_j$ (swapping two rows) to an identity matrix $I$. According to the definition, the matrix resulting from performing a single elementary operation on an identity matrix is called an elementary matrix.

For example, if we apply the operation $R_1 \leftrightarrow R_2$ to the $3 \times 3$ identity matrix $I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$, we get the matrix $\begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix}$. This is an elementary matrix.

Let's consider the other options:

(A) Identity matrix: Applying an elementary operation to an identity matrix does not, in general, return the identity matrix. Swapping two distinct rows of $I$ always produces a different matrix, and multiplying a row by any scalar other than 1 also changes it.

(C) Inverse matrix: While elementary matrices are invertible, the term "inverse matrix" refers to the inverse of a given matrix, not the result of an elementary operation on the identity matrix.

(D) Singular matrix: A singular matrix is a matrix whose determinant is zero. Elementary row operations preserve the singularity or non-singularity of a matrix. The identity matrix is non-singular (its determinant is 1). An elementary matrix formed by swapping rows or adding multiples of rows has a non-zero determinant. An elementary matrix formed by scaling a row by a non-zero scalar $c$ has a determinant equal to $c$, which is non-zero. Thus, elementary matrices are always non-singular. So the resulting matrix is not singular.

Therefore, the resulting matrix is called an elementary matrix.


The correct option is (B) Elementary.

Question 32. If $A$ and $B$ are square matrices of the same order, which of the following is correct?

(A) $(A+B)^2 = A^2 + 2AB + B^2$

(B) $(A-B)^2 = A^2 - 2AB + B^2$

(C) $(A+B)(A-B) = A^2 - B^2$

(D) $AB \neq BA$ in general

Answer:

Let $A$ and $B$ be square matrices of the same order. We need to evaluate the correctness of each given statement regarding matrix operations.


(A) $(A+B)^2 = A^2 + 2AB + B^2$

We expand the left side using the distributive property of matrix multiplication:

$(A+B)^2 = (A+B)(A+B) = A(A+B) + B(A+B)$

Applying distribution again:

$(A+B)^2 = AA + AB + BA + BB = A^2 + AB + BA + B^2$

For this to be equal to $A^2 + 2AB + B^2$, we would need $A^2 + AB + BA + B^2 = A^2 + 2AB + B^2$.

Subtracting $A^2$ and $B^2$ from both sides gives $AB + BA = 2AB$.

Subtracting $AB$ from both sides gives $BA = AB$.

This equality, $BA = AB$, means that matrices $A$ and $B$ commute under multiplication. Matrix multiplication is generally not commutative, i.e., $AB \neq BA$ in general. Therefore, the statement $(A+B)^2 = A^2 + 2AB + B^2$ is generally false for matrices.


(B) $(A-B)^2 = A^2 - 2AB + B^2$

We expand the left side using the distributive property:

$(A-B)^2 = (A-B)(A-B) = A(A-B) - B(A-B)$

Applying distribution again:

$(A-B)^2 = AA - AB - BA + BB = A^2 - AB - BA + B^2$

For this to be equal to $A^2 - 2AB + B^2$, we would need $A^2 - AB - BA + B^2 = A^2 - 2AB + B^2$.

Subtracting $A^2$ and $B^2$ from both sides gives $-AB - BA = -2AB$.

Adding $AB$ to both sides gives $-BA = -AB$, or $BA = AB$.

Again, this requires matrices $A$ and $B$ to commute. Since matrix multiplication is generally not commutative, the statement $(A-B)^2 = A^2 - 2AB + B^2$ is generally false for matrices.


(C) $(A+B)(A-B) = A^2 - B^2$

We expand the left side using the distributive property:

$(A+B)(A-B) = A(A-B) + B(A-B)$

Applying distribution again:

$(A+B)(A-B) = AA - AB + BA - BB = A^2 - AB + BA - B^2$

For this to be equal to $A^2 - B^2$, we would need $A^2 - AB + BA - B^2 = A^2 - B^2$.

Subtracting $A^2$ and adding $B^2$ to both sides gives $-AB + BA = O$ (the zero matrix).

Adding $AB$ to both sides gives $BA = AB$.

Again, this requires matrices $A$ and $B$ to commute. Since matrix multiplication is generally not commutative, the statement $(A+B)(A-B) = A^2 - B^2$ is generally false for matrices.


(D) $AB \neq BA$ in general

This statement asserts that matrix multiplication is not commutative in general. As shown in the analysis of options (A), (B), and (C), the order of multiplication matters for matrices, and $AB$ is not equal to $BA$ for arbitrary matrices $A$ and $B$. There are specific cases where $AB=BA$ (e.g., if $A$ and $B$ are diagonal matrices, or one is the identity matrix, etc.), but this property does not hold true for all pairs of square matrices. The phrase "in general" correctly reflects this fact.

This statement is a fundamental property of matrix algebra.

Therefore, the statement $AB \neq BA$ in general is correct.


The correct option is (D) $AB \neq BA$ in general.
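
A concrete counterexample can be checked numerically; the NumPy sketch below (illustrative, not part of the question) uses a non-commuting pair:

```python
# Illustrative counterexample check (not from the question)
import numpy as np

A = np.array([[1, 1], [0, 1]])
B = np.array([[1, 0], [1, 1]])   # A and B do not commute

lhs = np.linalg.matrix_power(A + B, 2)
rhs = np.linalg.matrix_power(A, 2) + 2 * (A @ B) + np.linalg.matrix_power(B, 2)
print(np.array_equal(lhs, rhs))        # False: (A+B)^2 != A^2 + 2AB + B^2 here
print(np.array_equal(A @ B, B @ A))    # False: the matrices do not commute
```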

Question 33. If $A$ is an invertible matrix, then which of the following is NOT necessarily true?

(A) $A$ is a square matrix.

(B) $|A| \neq 0$.

(C) $A^{-1}$ is also invertible.

(D) $A$ is symmetric.

Answer:

Let $A$ be an invertible matrix. We need to identify the statement that is not necessarily true about $A$.


(A) $A$ is a square matrix.

The definition of an invertible matrix applies only to square matrices. For a matrix $A$ to have an inverse $A^{-1}$ such that $AA^{-1} = A^{-1}A = I$, the products $AA^{-1}$ and $A^{-1}A$ must be defined and result in an identity matrix. This is only possible if $A$ and $A^{-1}$ are square matrices of the same order.

Therefore, if $A$ is an invertible matrix, it must be a square matrix. This statement is necessarily true.


(B) $|A| \neq 0$.

A square matrix $A$ is invertible if and only if its determinant $|A|$ is non-zero. This is a fundamental theorem in linear algebra.

Therefore, if $A$ is an invertible matrix, its determinant must be non-zero. This statement is necessarily true.


(C) $A^{-1}$ is also invertible.

If $A$ is an invertible matrix with inverse $A^{-1}$, then by the definition of the inverse, $A^{-1}A = AA^{-1} = I$. This shows that the matrix $A$ acts as the inverse of $A^{-1}$. Thus, $A^{-1}$ is also an invertible matrix, and its inverse is $A$, i.e., $(A^{-1})^{-1} = A$.

Therefore, if $A$ is an invertible matrix, $A^{-1}$ is also invertible. This statement is necessarily true.


(D) $A$ is symmetric.

A symmetric matrix is a square matrix $A$ such that its transpose $A'$ is equal to $A$ ($A' = A$). Invertibility is a property related to the determinant and the existence of an inverse, while symmetry is a property related to the relationship between the matrix and its transpose.

There are many invertible matrices that are not symmetric.

For example, consider the matrix $A = \begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix}$.

Its determinant is $|A| = (1 \times 1) - (2 \times 0) = 1 \neq 0$, so $A$ is invertible.

Its transpose is $A' = \begin{bmatrix} 1 & 0 \\ 2 & 1 \end{bmatrix}$.

Since $A' \neq A$, the matrix $A$ is not symmetric.

Thus, an invertible matrix is not necessarily symmetric.

Therefore, the statement "A is symmetric" is NOT necessarily true if $A$ is an invertible matrix.


The correct option is (D) $A$ is symmetric.

Question 34. Case Study: Three shops (S1, S2, S3) sell two types of sweets, Gulab Jamun (GJ) and Rasgulla (RG). The quantities (in kg) of each sweet sold over two days (Day 1, Day 2) are given by the matrix $Q = \begin{bmatrix} 10 & 12 \\ 15 & 10 \\ 8 & 15 \end{bmatrix}$, where rows represent shops and columns represent days. The price per kg (in $\textsf{₹}$) for each sweet is given by the matrix $P = \begin{bmatrix} 150 & 200 \end{bmatrix}$, where the first element is price of GJ and the second is price of RG.

To find the total revenue for each shop over the two days, the matrix calculation required is:

(A) $QP'$

(B) $PQ'$

(C) $QP$

(D) $P'Q$

Answer:

We are given the quantity matrix $Q$ and the price matrix $P$.

The matrix $Q = \begin{bmatrix} 10 & 12 \\ 15 & 10 \\ 8 & 15 \end{bmatrix}$ has rows representing shops (S1, S2, S3) and columns representing days (Day 1, Day 2). Thus, $Q$ is a $3 \times 2$ matrix.

The matrix $P = \begin{bmatrix} 150 & 200 \end{bmatrix}$ gives the price per kg for GJ ($\textsf{₹} 150$) and RG ($\textsf{₹} 200$). This matrix has 1 row and 2 columns, so it is a $1 \times 2$ matrix. The columns of P represent the sweet types (GJ, RG), while the single row represents the price.

The goal is to find the total revenue for each shop over the two days. Total revenue for a shop is the sum of (quantity of GJ sold $\times$ price of GJ) + (quantity of RG sold $\times$ price of RG), summed over the two days. The output should be a matrix representing the revenue for each of the 3 shops. A $3 \times 1$ matrix would be suitable for this.

Let's analyze the dimensions of the given matrices and the required output.

Order of $Q$ (Shops $\times$ Days) is $3 \times 2$.

Order of $P$ (Price/Type) is $1 \times 2$. The types are GJ and RG.

The transpose of $P$, denoted as $P'$, will have order $2 \times 1$. $P' = \begin{bmatrix} 150 \\ 200 \end{bmatrix}$, where rows represent types (GJ, RG) and the column represents price.

To calculate revenue, we need to multiply quantities of sweets sold by their respective prices. The matrix Q, as described, gives quantities per shop per day. The price matrix P gives prices per sweet type.

There appears to be an inconsistency in the problem statement's description of matrix $Q$: a revenue calculation requires the columns of the quantity matrix to correspond to the sweet types priced in $P$. Based on the options and the structure of $P$ (which lists prices by sweet type), we assume the columns of $Q$ actually represent the quantities of the two sweet types (GJ and RG) sold by each shop over the two days, rather than the quantities sold on Day 1 and Day 2 separately.

Let's assume the intended meaning of Q is: rows are Shops (S1, S2, S3) and columns are Quantities of Sweet Types (GJ, RG).

$Q_{Shops \times Types} = \begin{bmatrix} 10 & 12 \\ 15 & 10 \\ 8 & 15 \end{bmatrix}$ (Order $3 \times 2$). Where 10 is GJ from S1, 12 is RG from S1, etc.

The price matrix is $P_{Type \times Price} = \begin{bmatrix} 150 \\ 200 \end{bmatrix}$ (Order $2 \times 1$). This matrix is actually $P'$ from the given options, where rows are Types (GJ, RG) and the column is the Price.

To get the total revenue for each shop, we would multiply the quantity of each sweet by its price and sum them for each shop. This corresponds to matrix multiplication $Q_{Shops \times Types} \times P_{Types \times Price}$.

Let's check the dimensions of the options:

(A) $QP'$: Order of $Q$ is $3 \times 2$. Order of $P'$ is $2 \times 1$. The product $QP'$ has order $(3 \times 2) \times (2 \times 1) = 3 \times 1$. This resulting matrix is a $3 \times 1$ column vector, which can represent the total revenue for each of the 3 shops.

(B) $PQ'$: Order of $P$ is $1 \times 2$. Order of $Q'$ is $2 \times 3$. The product $PQ'$ has order $(1 \times 2) \times (2 \times 3) = 1 \times 3$. This is a row vector and does not represent revenue per shop in the required format.

(C) $QP$: Order of $Q$ is $3 \times 2$. Order of $P$ is $1 \times 2$. Matrix multiplication $QP$ is not defined because the number of columns in $Q$ (2) is not equal to the number of rows in $P$ (1).

(D) $P'Q$: Order of $P'$ is $2 \times 1$. Order of $Q$ is $3 \times 2$. Matrix multiplication $P'Q$ is not defined because the number of columns in $P'$ (1) is not equal to the number of rows in $Q$ (3).

Based on the requirement to obtain a result representing the revenue for each of the three shops (a $3 \times 1$ matrix) and the given options, the only dimensionally compatible and logically consistent operation that results in a $3 \times 1$ matrix is $QP'$. This operation calculates the sum of (quantity of sweet type $j$ by shop $i$) $\times$ (price of sweet type $j$), effectively giving the total revenue for each shop, assuming the columns of Q represent sweet types.


Although the problem statement describes the columns of Q as representing days, which is inconsistent with using the price matrix P (structured by sweet type) to calculate revenue per shop using matrix multiplication, the options provided strongly suggest that the intended calculation involves $QP'$. Assuming the likely intended structure where Q's columns represent sweet quantities per type for each shop, $QP'$ is the correct calculation.


The correct option is (A) $QP'$.
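
Under the assumed interpretation of $Q$, the revenue per shop can be computed directly; the following NumPy sketch (our own illustration) carries out $QP'$:

```python
# Illustrative computation (assumes Q's columns are sweet types, as argued above)
import numpy as np

Q = np.array([[10, 12],
              [15, 10],
              [ 8, 15]])         # rows: shops S1, S2, S3
P = np.array([[150, 200]])       # 1x2 price row: GJ, RG

revenue = Q @ P.T                # 3x1 column: total revenue per shop
print(revenue)                   # [[3900] [4250] [4200]]
```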

Question 35. If $A = \begin{bmatrix} 1 & 0 \\ 2 & -1 \end{bmatrix}$, find $A^{-1}$.

(A) $\begin{bmatrix} 1 & 0 \\ 2 & -1 \end{bmatrix}$

(B) $\begin{bmatrix} 1 & 0 \\ -2 & -1 \end{bmatrix}$

(C) $\begin{bmatrix} -1 & 0 \\ -2 & 1 \end{bmatrix}$

(D) $\begin{bmatrix} -1 & 0 \\ 2 & 1 \end{bmatrix}$

Answer:

Given:

Matrix $A = \begin{bmatrix} 1 & 0 \\ 2 & -1 \end{bmatrix}$

To Find:

The inverse of matrix $A$, denoted by $A^{-1}$.

Solution:

For a $2 \times 2$ matrix $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$, its inverse $A^{-1}$ is given by the formula:

$A^{-1} = \frac{1}{|A|} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$

where $|A|$ is the determinant of matrix $A$, calculated as $|A| = ad - bc$. The inverse exists if and only if $|A| \neq 0$.

For the given matrix $A = \begin{bmatrix} 1 & 0 \\ 2 & -1 \end{bmatrix}$, we have $a=1$, $b=0$, $c=2$, and $d=-1$.

First, calculate the determinant of $A$:

$|A| = (1)(-1) - (0)(2)$

$|A| = -1 - 0$

$|A| = -1$

Since $|A| = -1 \neq 0$, the matrix $A$ is invertible.

Now, apply the formula for the inverse:

$A^{-1} = \frac{1}{-1} \begin{bmatrix} -1 & -0 \\ -2 & 1 \end{bmatrix}$

$A^{-1} = -1 \begin{bmatrix} -1 & 0 \\ -2 & 1 \end{bmatrix}$

Multiply each element of the adjoint matrix by the scalar -1:

$A^{-1} = \begin{bmatrix} (-1) \times (-1) & (-1) \times 0 \\ (-1) \times (-2) & (-1) \times 1 \end{bmatrix}$

$A^{-1} = \begin{bmatrix} 1 & 0 \\ 2 & -1 \end{bmatrix}$

The inverse of matrix $A$ is $\begin{bmatrix} 1 & 0 \\ 2 & -1 \end{bmatrix}$.


Comparing the result with the given options, we find that it matches option (A).

Note that the inverse of this particular matrix is the matrix itself, which means $A^2 = I$ (the matrix is involutory).


The correct option is (A) $\begin{bmatrix} 1 & 0 \\ 2 & -1 \end{bmatrix}$.
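
A short numerical check (illustrative NumPy sketch, not part of the question) confirms both the inverse and the involutory property:

```python
# Illustrative check (not from the question)
import numpy as np

A = np.array([[1,  0],
              [2, -1]])

print(np.linalg.inv(A))                               # equals A itself
print(np.array_equal(A @ A, np.eye(2, dtype=int)))    # True: A is involutory
```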

Question 36. If $A = \begin{bmatrix} i & 0 \\ 0 & -i \end{bmatrix}$, where $i^2 = -1$, find $A^4$.

(A) $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

(B) $\begin{bmatrix} -1 & 0 \\ 0 & -1 \end{bmatrix}$

(C) $\begin{bmatrix} i & 0 \\ 0 & -i \end{bmatrix}$

(D) $\begin{bmatrix} -i & 0 \\ 0 & i \end{bmatrix}$

Answer:

Given:

Matrix $A = \begin{bmatrix} i & 0 \\ 0 & -i \end{bmatrix}$, where $i^2 = -1$.

To Find:

The matrix $A^4$.

Solution:

The given matrix $A$ is a diagonal matrix, as the non-diagonal elements are zero.

For a diagonal matrix $D = \begin{bmatrix} d_1 & 0 & \dots & 0 \\ 0 & d_2 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & d_n \end{bmatrix}$, the $k$-th power is obtained by raising each diagonal element to the power of $k$:

$D^k = \begin{bmatrix} d_1^k & 0 & \dots & 0 \\ 0 & d_2^k & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & d_n^k \end{bmatrix}$

In this case, $A = \begin{bmatrix} i & 0 \\ 0 & -i \end{bmatrix}$, the diagonal elements are $d_1 = i$ and $d_2 = -i$. We need to find $A^4$, so $k=4$.

$A^4 = \begin{bmatrix} i^4 & 0 \\ 0 & (-i)^4 \end{bmatrix}$

Now, we calculate the powers of $i$ and $-i$ using the property $i^2 = -1$:

$i^4 = (i^2)^2 = (-1)^2 = 1$

$(-i)^4 = ((-1) \times i)^4 = (-1)^4 \times i^4 = 1 \times (i^2)^2 = 1 \times (-1)^2 = 1 \times 1 = 1$

Substitute these values back into the expression for $A^4$:

$A^4 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

This matrix is the $2 \times 2$ identity matrix, $I_2$.


Comparing this result with the given options, we see that it matches option (A).


The correct option is (A) $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$.
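
NumPy handles complex entries directly, so the result can be verified as follows (illustrative sketch, not part of the question):

```python
# Illustrative check (not from the question); 1j is Python's imaginary unit
import numpy as np

A = np.array([[1j,   0],
              [0, -1j]])

A4 = np.linalg.matrix_power(A, 4)
print(np.allclose(A4, np.eye(2)))   # True: A^4 = I
```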

Question 37. If $A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$, express $A$ as the sum of a symmetric and a skew-symmetric matrix.

(A) $\underbrace{\begin{bmatrix} 1 & 2.5 \\ 2.5 & 4 \end{bmatrix}}_{\text{Symmetric}} + \underbrace{\begin{bmatrix} 0 & -0.5 \\ 0.5 & 0 \end{bmatrix}}_{\text{Skew-Symmetric}}$

(B) $\underbrace{\begin{bmatrix} 1 & 2.5 \\ 2.5 & 4 \end{bmatrix}}_{\text{Symmetric}} + \underbrace{\begin{bmatrix} 0 & 0.5 \\ -0.5 & 0 \end{bmatrix}}_{\text{Skew-Symmetric}}$

(C) $\underbrace{\begin{bmatrix} 2 & 5 \\ 5 & 8 \end{bmatrix}}_{\text{Symmetric}} + \underbrace{\begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}}_{\text{Skew-Symmetric}}$

(D) $\underbrace{\begin{bmatrix} 2 & 4 \\ 6 & 8 \end{bmatrix}}_{\text{Symmetric}} + \underbrace{\begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}}_{\text{Skew-Symmetric}}$

Answer:

Given:

Matrix $A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$.

To Express:

$A$ as the sum of a symmetric matrix and a skew-symmetric matrix.

Solution:

Any square matrix $A$ can be uniquely expressed as the sum of a symmetric matrix $P$ and a skew-symmetric matrix $Q$, where:

$P = \frac{1}{2}(A + A')$

(Symmetric part)

$Q = \frac{1}{2}(A - A')$

(Skew-symmetric part)

and $A = P + Q$.

First, find the transpose of $A$, denoted by $A'$:

$A' = \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix}$


Now, calculate $A + A'$:

$A + A' = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} + \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix} = \begin{bmatrix} 1+1 & 2+3 \\ 3+2 & 4+4 \end{bmatrix} = \begin{bmatrix} 2 & 5 \\ 5 & 8 \end{bmatrix}$

Calculate the symmetric part $P = \frac{1}{2}(A + A')$:

$P = \frac{1}{2} \begin{bmatrix} 2 & 5 \\ 5 & 8 \end{bmatrix} = \begin{bmatrix} \frac{1}{2} \times 2 & \frac{1}{2} \times 5 \\ \frac{1}{2} \times 5 & \frac{1}{2} \times 8 \end{bmatrix} = \begin{bmatrix} 1 & 2.5 \\ 2.5 & 4 \end{bmatrix}$

Check if $P$ is symmetric: $P' = \begin{bmatrix} 1 & 2.5 \\ 2.5 & 4 \end{bmatrix}' = \begin{bmatrix} 1 & 2.5 \\ 2.5 & 4 \end{bmatrix} = P$. Yes, $P$ is symmetric.


Next, calculate $A - A'$:

$A - A' = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} - \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix} = \begin{bmatrix} 1-1 & 2-3 \\ 3-2 & 4-4 \end{bmatrix} = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}$

Calculate the skew-symmetric part $Q = \frac{1}{2}(A - A')$:

$Q = \frac{1}{2} \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} = \begin{bmatrix} \frac{1}{2} \times 0 & \frac{1}{2} \times (-1) \\ \frac{1}{2} \times 1 & \frac{1}{2} \times 0 \end{bmatrix} = \begin{bmatrix} 0 & -0.5 \\ 0.5 & 0 \end{bmatrix}$

Check if $Q$ is skew-symmetric: $Q' = \begin{bmatrix} 0 & -0.5 \\ 0.5 & 0 \end{bmatrix}' = \begin{bmatrix} 0 & 0.5 \\ -0.5 & 0 \end{bmatrix}$.

$-Q = - \begin{bmatrix} 0 & -0.5 \\ 0.5 & 0 \end{bmatrix} = \begin{bmatrix} 0 & 0.5 \\ -0.5 & 0 \end{bmatrix}$.

Since $Q' = -Q$, Yes, $Q$ is skew-symmetric.


The expression of $A$ as the sum of a symmetric and a skew-symmetric matrix is $A = P + Q$:

$A = \underbrace{\begin{bmatrix} 1 & 2.5 \\ 2.5 & 4 \end{bmatrix}}_{\text{Symmetric}} + \underbrace{\begin{bmatrix} 0 & -0.5 \\ 0.5 & 0 \end{bmatrix}}_{\text{Skew-Symmetric}}$

Check the sum: $\begin{bmatrix} 1 & 2.5 \\ 2.5 & 4 \end{bmatrix} + \begin{bmatrix} 0 & -0.5 \\ 0.5 & 0 \end{bmatrix} = \begin{bmatrix} 1+0 & 2.5-0.5 \\ 2.5+0.5 & 4+0 \end{bmatrix} = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$, which is the original matrix $A$.


Comparing this result with the given options, we find that it matches option (A).


The correct option is (A) $\underbrace{\begin{bmatrix} 1 & 2.5 \\ 2.5 & 4 \end{bmatrix}}_{\text{Symmetric}} + \underbrace{\begin{bmatrix} 0 & -0.5 \\ 0.5 & 0 \end{bmatrix}}_{\text{Skew-Symmetric}}$.
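
The decomposition is mechanical enough to verify in code; the NumPy sketch below (our own illustration, not part of the question) computes the symmetric and skew-symmetric parts and checks that they sum to $A$:

```python
# Illustrative check (not from the question)
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

P = (A + A.T) / 2                 # symmetric part
Q = (A - A.T) / 2                 # skew-symmetric part
print(P)                          # [[1.  2.5] [2.5 4. ]]
print(Q)                          # [[ 0.  -0.5] [ 0.5  0. ]]
print(np.array_equal(P + Q, A))   # True: the parts sum back to A
```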

Question 38. Which of the following conditions must be met for a square matrix $A$ to be invertible? (Select all that apply)

(A) $A$ must be a square matrix.

(B) $|A| \neq 0$.

(C) $A$ must be symmetric.

(D) There must exist a matrix $B$ such that $AB = BA = I$.

Answer:

A square matrix $A$ of order $n$ is said to be invertible (or non-singular) if there exists a square matrix $B$ of the same order $n$ such that $AB = BA = I_n$, where $I_n$ is the identity matrix of order $n$. If such a matrix $B$ exists, it is unique and is called the inverse of $A$, denoted by $A^{-1}$.

Let's evaluate each condition:


(A) $A$ must be a square matrix.

As stated in the definition, the concept of invertibility is defined only for square matrices. Non-square matrices do not have inverses in the sense that $AB = BA = I$.

This condition must be met.


(B) $|A| \neq 0$.

A fundamental theorem in linear algebra states that a square matrix $A$ is invertible if and only if its determinant $|A|$ is non-zero. This is a necessary and sufficient condition for invertibility.

This condition must be met.


(C) $A$ must be symmetric.

A symmetric matrix is one that is equal to its transpose ($A' = A$). While some symmetric matrices are invertible, not all invertible matrices are symmetric, and not all symmetric matrices are invertible (a symmetric matrix is invertible if and only if its determinant is non-zero). For example, the invertible matrix $\begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix}$ is not symmetric. Conversely, the symmetric matrix $\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}$ is not invertible ($|A|=0$).

This condition is NOT necessarily met for an invertible matrix.


(D) There must exist a matrix $B$ such that $AB = BA = I$.

This statement is the very definition of an invertible matrix. If such a matrix $B$ exists (which is the inverse $A^{-1}$), then $A$ is invertible. If no such matrix exists, then $A$ is not invertible.

This condition must be met (it is the definition itself).


The conditions that must be met for a square matrix $A$ to be invertible are that it must be a square matrix, its determinant must be non-zero, and there must exist a matrix $B$ such that $AB=BA=I$.

Note on option (A): it restates part of the question's premise ("a square matrix $A$"). Being square is nonetheless part of the overall condition for invertibility, so it remains a true property of every invertible matrix even though it is given here. We therefore interpret the question as asking which of the listed properties are always true when a matrix is invertible.

If $A$ is an invertible matrix:

(A) $A$ is a square matrix - Always true by definition.

(B) $|A| \neq 0$ - Always true by the determinant criterion for invertibility.

(C) $A$ is symmetric - Not always true, as shown by example.

(D) There must exist a matrix $B$ such that $AB = BA = I$ - Always true by definition (this matrix $B$ is the inverse $A^{-1}$).

So, the conditions that must be met (are always true) for a matrix to be invertible are (A), (B), and (D).


The correct options are (A) $A$ must be a square matrix., (B) $|A| \neq 0$., and (D) There must exist a matrix $B$ such that $AB = BA = I$.

Question 39. If $A = \begin{bmatrix} 5 \end{bmatrix}$, then $A^{-1}$ is:

(A) $\begin{bmatrix} -5 \end{bmatrix}$

(B) $\begin{bmatrix} 0.2 \end{bmatrix}$

(C) $\begin{bmatrix} 1 \end{bmatrix}$

(D) Does not exist

Answer:

Given:

Matrix $A = \begin{bmatrix} 5 \end{bmatrix}$.

To Find:

The inverse of matrix $A$, denoted by $A^{-1}$.

Solution:

The given matrix $A$ is a $1 \times 1$ matrix. The determinant of a $1 \times 1$ matrix $\begin{bmatrix} a \end{bmatrix}$ is simply the value of the element $a$. In this case, $|A| = 5$.

A square matrix $A$ is invertible if and only if its determinant is non-zero. Since $|A| = 5 \neq 0$, the matrix $A$ is invertible, and its inverse exists.

For a $1 \times 1$ matrix $A = \begin{bmatrix} a \end{bmatrix}$, if $a \neq 0$, the inverse $A^{-1}$ is a $1 \times 1$ matrix $\begin{bmatrix} b \end{bmatrix}$ such that $AB = I$, where $I$ is the $1 \times 1$ identity matrix $\begin{bmatrix} 1 \end{bmatrix}$.

So, $A B = \begin{bmatrix} a \end{bmatrix} \begin{bmatrix} b \end{bmatrix} = \begin{bmatrix} ab \end{bmatrix}$.

We need $\begin{bmatrix} ab \end{bmatrix} = \begin{bmatrix} 1 \end{bmatrix}$, which means $ab = 1$.

The inverse element $b$ is $\frac{1}{a}$.

In this case, $a = 5$. So, the element of the inverse matrix is $b = \frac{1}{5} = 0.2$.

The inverse matrix $A^{-1}$ is $\begin{bmatrix} 0.2 \end{bmatrix}$.

Alternatively, using the adjoint formula: for a $1 \times 1$ matrix $A = \begin{bmatrix} a \end{bmatrix}$, the adjoint is $\begin{bmatrix} 1 \end{bmatrix}$ (the cofactor of the single element is taken to be 1), and $|A| = a$. So, $A^{-1} = \frac{1}{|A|} \text{adj}(A) = \frac{1}{a} \begin{bmatrix} 1 \end{bmatrix} = \begin{bmatrix} \frac{1}{a} \end{bmatrix}$.

For $A = \begin{bmatrix} 5 \end{bmatrix}$, $A^{-1} = \begin{bmatrix} \frac{1}{5} \end{bmatrix} = \begin{bmatrix} 0.2 \end{bmatrix}$.


Comparing this result with the given options, we find that it matches option (B).


The correct option is (B) $\begin{bmatrix} 0.2 \end{bmatrix}$.

Question 40. Assertion (A): If $A$ and $B$ are two matrices such that $AB = O$, then either $A = O$ or $B = O$.

Reason (R): In matrix multiplication, the product of two non-zero matrices can be the zero matrix.

(A) Both A and R are true and R is the correct explanation of A.

(B) Both A and R are true but R is not the correct explanation of A.

(C) A is true but R is false.

(D) A is false but R is true.

Answer:

Let's analyze the Assertion and the Reason separately.


Assertion (A): If $A$ and $B$ are two matrices such that $AB = O$, then either $A = O$ or $B = O$.

This statement claims that if the product of two matrices is the zero matrix, then at least one of the matrices must be the zero matrix. This property holds for multiplication of real numbers (if $ab=0$, then $a=0$ or $b=0$), but it is not generally true for matrix multiplication.

Consider the matrices $A = \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix}$ and $B = \begin{bmatrix} 1 & 0 \\ -1 & 0 \end{bmatrix}$.

Neither matrix $A$ nor matrix $B$ is the zero matrix ($O$).

Let's calculate their product $AB$:

$AB = \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ -1 & 0 \end{bmatrix} = \begin{bmatrix} (1 \times 1) + (1 \times -1) & (1 \times 0) + (1 \times 0) \\ (0 \times 1) + (0 \times -1) & (0 \times 0) + (0 \times 0) \end{bmatrix} = \begin{bmatrix} 1 - 1 & 0 + 0 \\ 0 + 0 & 0 + 0 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$

Here, $AB = O$, but $A \neq O$ and $B \neq O$. This is a counterexample to the assertion.

Therefore, the Assertion (A) is False.


Reason (R): In matrix multiplication, the product of two non-zero matrices can be the zero matrix.

This statement claims that it is possible to multiply two matrices that are not the zero matrix and obtain the zero matrix as the result. The example provided in the analysis of Assertion (A) serves as proof for this statement.

Using $A = \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix}$ and $B = \begin{bmatrix} 1 & 0 \\ -1 & 0 \end{bmatrix}$, we showed that $A \neq O$, $B \neq O$, and $AB = O$.

Therefore, the Reason (R) is True.


Relationship between Assertion and Reason:

The Assertion makes a claim that is incorrect. The Reason states a fact about matrix multiplication (the possibility of zero product from non-zero matrices) that directly contradicts the Assertion.

Since Assertion (A) is false and Reason (R) is true, the correct option is (D).


The correct option is (D) A is false but R is true.
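
The counterexample is easy to reproduce numerically (illustrative NumPy sketch, not part of the question):

```python
# Illustrative check (not from the question): non-zero factors, zero product
import numpy as np

A = np.array([[1, 1],
              [0, 0]])
B = np.array([[ 1, 0],
              [-1, 0]])

print(A @ B)                      # [[0 0] [0 0]]
```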

Question 41. If an elementary operation is applied to a matrix $A$, it is equivalent to pre-multiplying $A$ by the corresponding ____ matrix.

(A) Inverse

(B) Transpose

(C) Elementary

(D) Diagonal

Answer:

This question relates elementary row operations (and column operations) to matrix multiplication.

An elementary matrix is a matrix obtained by performing a single elementary row operation on an identity matrix. Let's call this elementary matrix $E$.

A key property in matrix theory is that applying an elementary row operation to a matrix $A$ is equivalent to pre-multiplying $A$ by the elementary matrix $E$ that is obtained by applying the same elementary row operation to the identity matrix $I$ of the appropriate size.

Specifically, if an elementary row operation $R$ is applied to matrix $A$, and $E$ is the elementary matrix obtained by applying the same operation $R$ to the identity matrix $I$, then $R(A) = E A$. Here, $E$ is of the same order as the number of rows in $A$, and $I$ is the identity matrix of that order.

For example, let $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$ and $I = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$.

Apply the row operation $R_1 \leftrightarrow R_2$ to $A$: The resulting matrix is $\begin{bmatrix} c & d \\ a & b \end{bmatrix}$.

Apply the same row operation $R_1 \leftrightarrow R_2$ to the identity matrix $I$: The elementary matrix is $E = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$.

Now, pre-multiply $A$ by $E$:

$E A = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} a & b \\ c & d \end{bmatrix} = \begin{bmatrix} (0 \times a) + (1 \times c) & (0 \times b) + (1 \times d) \\ (1 \times a) + (0 \times c) & (1 \times b) + (0 \times d) \end{bmatrix} = \begin{bmatrix} c & d \\ a & b \end{bmatrix}$

The result matches the matrix obtained by directly applying the row operation to $A$.

Similarly, applying an elementary column operation to a matrix $A$ is equivalent to post-multiplying $A$ by the corresponding elementary matrix $E_c$ obtained by applying the same elementary column operation to the identity matrix $I$. If $C$ is an elementary column operation applied to $A$, and $E_c$ is the elementary matrix from applying $C$ to $I$, then $C(A) = A E_c$. Here, $E_c$ is of the same order as the number of columns in $A$, and $I$ is the identity matrix of that order.

The question specifies "pre-multiplying", which corresponds to elementary row operations.

Let's consider the options:

(A) Inverse: While elementary matrices are invertible, the statement refers to the type of matrix used for pre-multiplication, which is an elementary matrix.

(B) Transpose: Pre-multiplying by a transpose matrix does not represent elementary row operations in general.

(C) Elementary: This matches the definition and property described above.

(D) Diagonal: Pre-multiplying by a diagonal matrix (with non-zero diagonal elements) corresponds to scaling the rows of $A$ by the diagonal entries. This is only one specific type of elementary row operation (scaling a row), not all types.

Therefore, if an elementary row operation is applied to a matrix $A$, it is equivalent to pre-multiplying $A$ by the corresponding elementary matrix.


The correct option is (C) Elementary.
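
The equivalence between a row operation and pre-multiplication by the corresponding elementary matrix can be checked directly (illustrative NumPy sketch; the indexing trick used to build $E$ is our own choice, not part of the question):

```python
# Illustrative check (not from the question)
import numpy as np

A = np.array([[1, 2],
              [3, 4]])

E = np.eye(2, dtype=int)[[1, 0]]  # elementary matrix: rows of I reordered (R1 <-> R2)
print(E)                          # [[0 1] [1 0]]
print(E @ A)                      # [[3 4] [1 2]]: same as applying R1 <-> R2 to A
```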

Question 42. If $A$ is a square matrix such that $A^2 = I$, then $A$ is called a/an ____ matrix.

(A) Idempotent

(B) Nilpotent

(C) Involutory

(D) Orthogonal (Note: some involutory matrices are also orthogonal, but choose the definition that fits the given condition best)

Answer:

We are given a square matrix $A$ such that $A^2 = I$, where $I$ is the identity matrix of the same order as $A$. We need to identify the type of matrix that satisfies this condition.

Let's look at the definitions of the matrix types given in the options:

(A) Idempotent matrix: A square matrix $A$ is idempotent if $A^2 = A$.

The given condition is $A^2 = I$. For an idempotent matrix (where $A^2 = A$) to satisfy the given condition ($A^2 = I$), we would need $A = I$. So, the identity matrix is an idempotent matrix that satisfies the given condition, but not all matrices satisfying $A^2=I$ are the identity matrix (e.g., $\begin{bmatrix} -1 & 0 \\ 0 & -1 \end{bmatrix}$). Thus, the definition of idempotent matrix does not match the given condition directly.

(B) Nilpotent matrix: A square matrix $A$ is nilpotent if there exists a positive integer $k$ such that $A^k = O$, where $O$ is the zero matrix. The smallest such $k$ is called the index of nilpotency.

The given condition is $A^2 = I$. This involves the identity matrix, not the zero matrix. If $A^k = O$, then $|A^k| = |O| = 0$, which implies $|A|^k = 0$, so $|A| = 0$. A nilpotent matrix is always singular. An involutory matrix ($A^2=I$) has determinant $|A^2|=|I|$, so $|A|^2=1$, meaning $|A|=\pm 1$. An involutory matrix is always non-singular. Thus, a matrix satisfying $A^2=I$ cannot be a nilpotent matrix (except for the trivial case of a $0 \times 0$ matrix, which isn't typically considered).

(C) Involutory matrix: A square matrix $A$ is involutory if $A^2 = I$.

This definition exactly matches the given condition $A^2 = I$.

(D) Orthogonal matrix: A square matrix $A$ is orthogonal if $A A' = A' A = I$, where $A'$ is the transpose of $A$.

If a matrix is involutory ($A^2=I$) and also symmetric ($A'=A$), then $AA' = AA = A^2 = I$, so a symmetric involutory matrix is orthogonal. However, an involutory matrix need not be orthogonal. For example, $A = \begin{bmatrix} 1 & 1 \\ 0 & -1 \end{bmatrix}$ satisfies $A^2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = I$, so it is involutory, but $AA' = \begin{bmatrix} 1 & 1 \\ 0 & -1 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 1 & -1 \end{bmatrix} = \begin{bmatrix} 2 & -1 \\ -1 & 1 \end{bmatrix} \neq I$, so it is not orthogonal. The two definitions are therefore distinct, and the condition $A^2=I$ is precisely the definition of an involutory matrix.

The note in option (D) acknowledges that some involutory matrices are orthogonal, but emphasizes that the definition for the given condition $A^2=I$ specifically corresponds to an involutory matrix.


The condition $A^2 = I$ is the definition of an involutory matrix.


The correct option is (C) Involutory.
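
The involutory-but-not-orthogonal example above can be verified numerically (illustrative NumPy sketch, not part of the question):

```python
# Illustrative check (not from the question): involutory but not orthogonal
import numpy as np

A = np.array([[1,  1],
              [0, -1]])

I = np.eye(2, dtype=int)
print(np.array_equal(A @ A, I))     # True:  A^2 = I (involutory)
print(np.array_equal(A @ A.T, I))   # False: A A' != I (not orthogonal)
```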

Question 43. The matrix $\begin{bmatrix} 1 & 2 & 3 \\ 0 & 4 & 5 \\ 0 & 0 & 6 \end{bmatrix}$ is a:

(A) Lower triangular matrix

(B) Upper triangular matrix

(C) Diagonal matrix

(D) Scalar matrix

Answer:

Let the given matrix be $A = \begin{bmatrix} 1 & 2 & 3 \\ 0 & 4 & 5 \\ 0 & 0 & 6 \end{bmatrix}$. We need to classify this matrix based on its structure.

The elements of the matrix are $a_{ij}$, where $i$ is the row index and $j$ is the column index.


Definition of Triangular Matrices:

A square matrix $A = [a_{ij}]$ is a lower triangular matrix if all the elements above the main diagonal are zero, i.e., $a_{ij} = 0$ for all $i < j$.

A square matrix $A = [a_{ij}]$ is an upper triangular matrix if all the elements below the main diagonal are zero, i.e., $a_{ij} = 0$ for all $i > j$.


Let's look at the elements of matrix $A$ relative to the main diagonal (the elements where $i=j$, which are 1, 4, and 6).

Elements below the main diagonal ($i > j$):

  • $a_{21}$ (row 2, column 1): 0 (Here $i=2, j=1$, $i > j$)
  • $a_{31}$ (row 3, column 1): 0 (Here $i=3, j=1$, $i > j$)
  • $a_{32}$ (row 3, column 2): 0 (Here $i=3, j=2$, $i > j$)

All elements below the main diagonal are zero. This matches the definition of an upper triangular matrix.

Elements above the main diagonal ($i < j$):

  • $a_{12}$ (row 1, column 2): 2 (Here $i=1, j=2$, $i < j$)
  • $a_{13}$ (row 1, column 3): 3 (Here $i=1, j=3$, $i < j$)
  • $a_{23}$ (row 2, column 3): 5 (Here $i=2, j=3$, $i < j$)

Since some elements above the main diagonal are non-zero (2, 3, 5), the matrix is not a lower triangular matrix (unless it's also a diagonal matrix, which is a special case of both).


Definition of Diagonal Matrix: A square matrix is diagonal if all its non-diagonal elements are zero ($a_{ij} = 0$ for $i \neq j$).

Since some non-diagonal elements (2, 3, 5) are non-zero, the matrix is not a diagonal matrix.


Definition of Scalar Matrix: A scalar matrix is a diagonal matrix where all diagonal elements are equal. It is a special case of a diagonal matrix.

Since the matrix is not a diagonal matrix, it cannot be a scalar matrix.


Based on the elements being zero below the main diagonal, the matrix is an upper triangular matrix.


The correct option is (B) Upper triangular matrix.

Question 44. If $A = \begin{bmatrix} \cos x & \sin x \\ -\sin x & \cos x \end{bmatrix}$, then $A A'$ is equal to:

(A) $\begin{bmatrix} \cos^2 x & -\sin^2 x \\ -\sin^2 x & \cos^2 x \end{bmatrix}$

(B) $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

(C) $\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$

(D) $\begin{bmatrix} \cos 2x & \sin 2x \\ -\sin 2x & \cos 2x \end{bmatrix}$

Answer:

Given:

Matrix $A = \begin{bmatrix} \cos x & \sin x \\ -\sin x & \cos x \end{bmatrix}$.

To Find:

The matrix product $A A'$.

Solution:

First, find the transpose of matrix $A$, denoted by $A'$. The transpose is obtained by interchanging the rows and columns of $A$.

$A' = \begin{bmatrix} \cos x & -\sin x \\ \sin x & \cos x \end{bmatrix}$

Now, calculate the matrix product $A A'$:

$A A' = \begin{bmatrix} \cos x & \sin x \\ -\sin x & \cos x \end{bmatrix} \begin{bmatrix} \cos x & -\sin x \\ \sin x & \cos x \end{bmatrix}$

Let $A A' = B = \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix}$. We calculate the elements of $B$ by multiplying the rows of $A$ by the columns of $A'$ and summing the products.

$b_{11} = (\cos x)(\cos x) + (\sin x)(\sin x) = \cos^2 x + \sin^2 x$

Using the trigonometric identity $\cos^2 x + \sin^2 x = 1$:

$b_{11} = 1$

$b_{12} = (\cos x)(-\sin x) + (\sin x)(\cos x) = -\cos x \sin x + \sin x \cos x = 0$

$b_{21} = (-\sin x)(\cos x) + (\cos x)(\sin x) = -\sin x \cos x + \cos x \sin x = 0$

$b_{22} = (-\sin x)(-\sin x) + (\cos x)(\cos x) = \sin^2 x + \cos^2 x$

Using the trigonometric identity $\sin^2 x + \cos^2 x = 1$:

$b_{22} = 1$

Substitute these values into the matrix $B$:

$A A' = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

This matrix is the $2 \times 2$ identity matrix, $I_2$.

Matrices of the form $\begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{bmatrix}$ represent rotations in a 2D plane, and a defining property of rotation matrices (and, more generally, of orthogonal matrices) is that their product with their transpose is the identity matrix. Here $A = \begin{bmatrix} \cos x & \sin x \\ -\sin x & \cos x \end{bmatrix}$ is the rotation matrix through angle $-x$ (equivalently, the transpose of the rotation through $x$), so $AA' = I$ is expected, and the direct calculation above confirms it.


Comparing the calculated result with the given options, we find that it matches option (B).


The correct option is (B) $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$.
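
Since the identity $AA' = I$ holds for every value of $x$, a numerical spot-check at an arbitrary angle is convincing (illustrative NumPy sketch; the choice $x = 0.7$ is ours, not part of the question):

```python
# Illustrative check (not from the question); x = 0.7 is an arbitrary angle
import numpy as np

x = 0.7
A = np.array([[ np.cos(x), np.sin(x)],
              [-np.sin(x), np.cos(x)]])

print(np.allclose(A @ A.T, np.eye(2)))   # True for any x
```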

Question 45. If $A$ is a square matrix of order $n$ and $|A| \neq 0$, then the order of $A^{-1}$ is:

(A) $n \times n$

(B) $1 \times 1$

(C) $n \times 1$

(D) $1 \times n$

Answer:

Given:

Matrix $A$ is a square matrix of order $n$. This means the order of $A$ is $n \times n$ (having $n$ rows and $n$ columns).

The determinant of $A$ is non-zero, i.e., $|A| \neq 0$. This condition ensures that the inverse of $A$ exists.

To Find:

The order of the inverse matrix $A^{-1}$.

Solution:

By the definition of the inverse of a matrix, if $A$ is a square matrix of order $n$, its inverse $A^{-1}$ is a matrix of the same order $n$ such that:

$A A^{-1} = I_n$

$A^{-1} A = I_n$

where $I_n$ is the identity matrix of order $n$. The identity matrix $I_n$ is an $n \times n$ matrix.

For matrix multiplication to be defined, the number of columns in the first matrix must equal the number of rows in the second matrix. The resulting matrix has the number of rows of the first matrix and the number of columns of the second matrix.

Let the order of $A^{-1}$ be $p \times q$.

Consider the product $A A^{-1} = I_n$:

  • Order of $A$ is $n \times n$.
  • Order of $A^{-1}$ is $p \times q$.
  • Order of $I_n$ is $n \times n$.

For the product $A A^{-1}$ to be defined, the number of columns in $A$ must equal the number of rows in $A^{-1}$.

$n = p$

The order of the product $A A^{-1}$ is $n \times q$. Since $A A^{-1} = I_n$, the order of $A A^{-1}$ must be equal to the order of $I_n$.

$n \times q = n \times n$

This implies $q = n$.

From the condition $A A^{-1} = I_n$, we found that the order of $A^{-1}$ must be $n \times n$.

We can also verify this using the product $A^{-1} A = I_n$:

  • Order of $A^{-1}$ is $p \times q$.
  • Order of $A$ is $n \times n$.
  • Order of $I_n$ is $n \times n$.

For the product $A^{-1} A$ to be defined, the number of columns in $A^{-1}$ must equal the number of rows in $A$.

$q = n$

The order of the product $A^{-1} A$ is $p \times n$. Since $A^{-1} A = I_n$, the order of $A^{-1} A$ must be equal to the order of $I_n$.

$p \times n = n \times n$

This implies $p = n$.

Both products confirm that the order of $A^{-1}$ must be $n \times n$.

Also, recall the formula for the inverse of a matrix using the adjoint: $A^{-1} = \frac{1}{|A|} \text{adj}(A)$. The adjoint of a square matrix of order $n$ is also a square matrix of order $n$. Multiplying a matrix by a non-zero scalar (like $\frac{1}{|A|}$) does not change its order. Therefore, the order of $A^{-1}$ is the same as the order of $\text{adj}(A)$, which is $n \times n$. This requires $|A| \neq 0$ for the scalar $\frac{1}{|A|}$ to be defined.


The order of $A^{-1}$ is $n \times n$.


The correct option is (A) $n \times n$.

Question 46. The matrix $\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}$ is a ____ matrix.

(A) Square

(B) Row

(C) Column

(D) Rectangular

Answer:

Let the given matrix be $A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}$. We need to classify this matrix based on its dimensions.

The order of a matrix is given by the number of rows $\times$ the number of columns.

Matrix $A$ has 2 rows and 3 columns. So, its order is $2 \times 3$.


Let's consider the definitions of the matrix types given in the options:

(A) Square matrix: A matrix is square if the number of rows is equal to the number of columns ($m = n$).

For matrix $A$, the number of rows is 2 and the number of columns is 3. Since $2 \neq 3$, matrix $A$ is not a square matrix.

(B) Row matrix: A matrix is a row matrix if it has only one row ($m = 1$).

Matrix $A$ has 2 rows. Since $2 \neq 1$, matrix $A$ is not a row matrix.

(C) Column matrix: A matrix is a column matrix if it has only one column ($n = 1$).

Matrix $A$ has 3 columns. Since $3 \neq 1$, matrix $A$ is not a column matrix.

(D) Rectangular matrix: Strictly speaking, every matrix is a rectangular array, and a square matrix is the special case $m = n$. In practice, however, a matrix is called rectangular when the number of rows is not equal to the number of columns ($m \neq n$), which distinguishes it from square, row, and column matrices.

In the given matrix, the number of rows (2) is not equal to the number of columns (3). Thus, it fits the description of a rectangular matrix where $m \neq n$.

While a square matrix is a special case of a rectangular matrix, when options include both "Square" and "Rectangular", and the matrix is not square, "Rectangular" (specifically when $m \neq n$) is the most appropriate classification among the given choices that describes the matrix's dimensions.


The given matrix has 2 rows and 3 columns, and the number of rows is not equal to the number of columns. It is therefore a rectangular matrix.


The correct option is (D) Rectangular.

Question 47. If $A$ and $B$ are two matrices such that $AB = BA$, then $A$ and $B$ are said to:

(A) Commute

(B) Anti-commute

(C) Be orthogonal

(D) Be singular

Answer:

We are given that $A$ and $B$ are two matrices such that their product in one order ($AB$) is equal to their product in the reverse order ($BA$). This property is related to the commutativity of matrix multiplication.


Let's look at the definitions related to the given options:

(A) Commute: Two matrices $A$ and $B$ are said to commute with respect to multiplication if $AB = BA$. This is the standard definition of matrices commuting under multiplication. For $AB$ and $BA$ to be compared, both products must be defined and have the same order. This typically occurs when $A$ and $B$ are square matrices of the same order.

(B) Anti-commute: Two matrices $A$ and $B$ are said to anti-commute if $AB = -BA$. This is different from the given condition $AB = BA$.

(C) Be orthogonal: An orthogonal matrix is a square matrix $A$ such that $A A' = A' A = I$. This property is related to the transpose and the identity matrix, not directly to the relationship between the products $AB$ and $BA$ of two arbitrary matrices $A$ and $B$.

(D) Be singular: A square matrix is singular if its determinant is zero ($|A|=0$). This property is about a single matrix and its determinant, not about the relationship between the products of two matrices.

The condition $AB = BA$ is precisely the definition of matrices $A$ and $B$ commuting with respect to multiplication.


Therefore, if $A$ and $B$ are two matrices such that $AB = BA$, then $A$ and $B$ are said to commute.


The correct option is (A) Commute.

Question 48. If $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$ and $ad - bc \neq 0$, then $A^{-1} = \frac{1}{ad-bc} \begin{bmatrix} \dots \end{bmatrix}$. Complete the matrix.

(A) $\begin{bmatrix} a & -b \\ -c & d \end{bmatrix}$

(B) $\begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$

(C) $\begin{bmatrix} -d & b \\ c & -a \end{bmatrix}$

(D) $\begin{bmatrix} d & c \\ b & a \end{bmatrix}$

Answer:

Given:

A $2 \times 2$ matrix $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$ and the condition $ad - bc \neq 0$.

To Complete:

The matrix in the formula for the inverse $A^{-1} = \frac{1}{ad-bc} \begin{bmatrix} \dots \end{bmatrix}$.

Solution:

The term $ad - bc$ is the determinant of the matrix $A$, denoted as $|A|$. The condition $|A| \neq 0$ indicates that the matrix $A$ is invertible.

The formula for the inverse of a $2 \times 2$ matrix $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is given by:

$A^{-1} = \frac{1}{|A|} \text{adj}(A)$

where $\text{adj}(A)$ is the adjoint of matrix $A$. For a $2 \times 2$ matrix, the adjoint is obtained by swapping the diagonal elements and negating the non-diagonal elements.

The adjoint of $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is $\text{adj}(A) = \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$.

Substituting the determinant and the adjoint into the inverse formula:

$A^{-1} = \frac{1}{ad-bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$

The matrix that completes the expression is $\begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$.


Comparing this matrix with the given options, we find that it matches option (B).


The correct option is (B) $\begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$.
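
The adjoint formula is easily implemented and compared against a library inverse; the sketch below (our own illustration, assuming NumPy, not part of the question) does exactly that:

```python
# Illustrative sketch (not from the question): 2x2 inverse via the adjoint formula
import numpy as np

def inverse_2x2(M):
    """Return the inverse of a 2x2 matrix, assuming its determinant is non-zero."""
    a, b = M[0]
    c, d = M[1]
    det = a * d - b * c
    return np.array([[ d, -b],
                     [-c,  a]]) / det

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(np.allclose(inverse_2x2(M), np.linalg.inv(M)))   # True
```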

Question 49. Consider the statement: "Every symmetric matrix is a diagonal matrix." This statement is:

(A) Always true

(B) Always false

(C) True only if the matrix is also skew-symmetric

(D) True only if the matrix is of order $1 \times 1$

Answer:

Let's analyze the definitions of symmetric and diagonal matrices.

A square matrix $A = [a_{ij}]$ is symmetric if $A' = A$, which means $a_{ij} = a_{ji}$ for all $i, j$.

A square matrix $A = [a_{ij}]$ is a diagonal matrix if all its non-diagonal elements are zero, i.e., $a_{ij} = 0$ for all $i \neq j$.


The statement is "Every symmetric matrix is a diagonal matrix." We need to determine if this statement is always true, always false, or true under specific conditions.

Let's consider some examples of symmetric matrices:

Consider the $2 \times 2$ matrix $A = \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}$. This matrix is symmetric because $A' = \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}' = \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix} = A$.

However, this matrix is not a diagonal matrix because the non-diagonal elements (2 and 2) are not zero.

This single counterexample is enough to show that the statement "Every symmetric matrix is a diagonal matrix" is not always true.


Let's examine the conditions under which a symmetric matrix would also be a diagonal matrix.

A symmetric matrix is diagonal if and only if all of its off-diagonal elements are zero, which is precisely the definition of a diagonal matrix; symmetry itself only requires $a_{ij} = a_{ji}$ and places no constraint forcing the off-diagonal entries to vanish. Conversely, every diagonal matrix is automatically symmetric, since $a_{ij} = 0 = a_{ji}$ for all $i \neq j$. The diagonal elements $a_{ii}$ are unconstrained in both cases.

Consider the given options:

(A) Always true: False, as shown by the counterexample.

(B) Always false: False. For example, the identity matrix $I = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$ is symmetric ($I' = I$) and is also a diagonal matrix. The zero matrix $O = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$ is symmetric and diagonal. A $1 \times 1$ matrix $\begin{bmatrix} a \end{bmatrix}$ is always symmetric ($\begin{bmatrix} a \end{bmatrix}' = \begin{bmatrix} a \end{bmatrix}$) and is also a diagonal matrix (as it has no non-diagonal elements).

(C) True only if the matrix is also skew-symmetric: A matrix that is both symmetric and skew-symmetric must satisfy $A' = A$ and $A' = -A$. This implies $A = -A$, which means $2A = O$, so $A = O$ (the zero matrix). The zero matrix is both symmetric and diagonal. So, if a matrix is both symmetric and skew-symmetric, it is the zero matrix, which is indeed diagonal. However, the statement is "Every symmetric matrix is a diagonal matrix". This is not true just because the zero matrix is both. There are many symmetric matrices that are not diagonal and not skew-symmetric (like $\begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}$).

(D) True only if the matrix is of order $1 \times 1$: A $1 \times 1$ matrix $A = \begin{bmatrix} a \end{bmatrix}$ has $A' = \begin{bmatrix} a \end{bmatrix}$. Since $A' = A$, it is always symmetric. A $1 \times 1$ matrix $\begin{bmatrix} a \end{bmatrix}$ has no non-diagonal elements, so it is always a diagonal matrix by definition. Thus, a $1 \times 1$ symmetric matrix is always a diagonal matrix. For matrices of order greater than $1 \times 1$, we have found counterexamples (like $\begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}$). So, the statement is true if and only if the matrix is of order $1 \times 1$.


The statement "Every symmetric matrix is a diagonal matrix" is true only for $1 \times 1$ matrices.


The correct option is (D) True only if the matrix is of order $1 \times 1$.

Question 50. If $A$ is a square matrix of order $n$, then $|kA|$ is equal to:

(A) $k|A|$

(B) $k^n|A|$

(C) $n|A|$

(D) $k^2|A|$

Answer:

Given:

Matrix $A$ is a square matrix of order $n$. This means $A$ is an $n \times n$ matrix.

$k$ is a scalar.

To Find:

The determinant of the matrix $kA$, denoted by $|kA|$.

Solution:

The matrix $kA$ is obtained by multiplying every element of the matrix $A$ by the scalar $k$. If $A = [a_{ij}]$, then $kA = [ka_{ij}]$.

Let's consider the determinant of $kA$. The determinant of an $n \times n$ matrix is calculated as a sum of $n!$ terms. Each term is a product of $n$ elements, with one element taken from each row and each column, multiplied by a sign determined by the permutation of column indices.

For a general $n \times n$ matrix $A = \begin{bmatrix} a_{11} & a_{12} & \dots & a_{1n} \\ a_{21} & a_{22} & \dots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \dots & a_{nn} \end{bmatrix}$, its determinant is often defined using cofactor expansion or by the Leibniz formula:

$|A| = \sum_{\sigma \in S_n} (\text{sgn}(\sigma) \prod_{i=1}^n a_{i, \sigma(i)})$

where $S_n$ is the set of all permutations of $\{1, 2, \dots, n\}$, $\text{sgn}(\sigma)$ is the sign of the permutation $\sigma$, and $\sigma(i)$ is the column index for the element in the $i$-th row.

Now consider the matrix $kA = \begin{bmatrix} ka_{11} & ka_{12} & \dots & ka_{1n} \\ ka_{21} & ka_{22} & \dots & ka_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ ka_{n1} & ka_{n2} & \dots & ka_{nn} \end{bmatrix}$.

The determinant of $kA$ is:

$|kA| = \sum_{\sigma \in S_n} (\text{sgn}(\sigma) \prod_{i=1}^n (ka_{i, \sigma(i)}))$

We can factor out the scalar $k$ from each term in the product $\prod_{i=1}^n (ka_{i, \sigma(i)})$. There are $n$ terms in the product, each containing one factor of $k$.

$\prod_{i=1}^n (ka_{i, \sigma(i)}) = (ka_{1, \sigma(1)})(ka_{2, \sigma(2)}) \dots (ka_{n, \sigma(n)}) = k^n (a_{1, \sigma(1)} a_{2, \sigma(2)} \dots a_{n, \sigma(n)}) = k^n \prod_{i=1}^n a_{i, \sigma(i)}$

Substitute this back into the determinant formula for $kA$:

$|kA| = \sum_{\sigma \in S_n} (\text{sgn}(\sigma) k^n \prod_{i=1}^n a_{i, \sigma(i)})$

We can factor out the common scalar $k^n$ from the summation:

$|kA| = k^n \sum_{\sigma \in S_n} (\text{sgn}(\sigma) \prod_{i=1}^n a_{i, \sigma(i)})$

The remaining summation is the determinant of matrix $A$, $|A|$.

Therefore, $|kA| = k^n |A|$.


This property states that if a matrix is scaled by a factor $k$, its determinant is scaled by $k$ raised to the power of the matrix's order.

Comparing this result with the given options, we find that it matches option (B).


The correct option is (B) $k^n|A|$.
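
As a quick numerical check of this property (a minimal NumPy sketch, not part of the textbook solution; the matrix and scalar below are arbitrary choices):

```python
import numpy as np

# Arbitrary 3x3 matrix and scalar, chosen only for illustration
A = np.array([[1.0, 2.0, 0.0],
              [3.0, 1.0, 4.0],
              [0.0, 2.0, 5.0]])
k = 3.0
n = A.shape[0]

# |kA| should equal k^n * |A| for an n x n matrix
print(np.isclose(np.linalg.det(k * A), k**n * np.linalg.det(A)))  # True
```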



Short Answer Type Questions

Question 1. Construct a $3 \times 2$ matrix whose elements are given by $a_{ij} = \frac{|i-j|}{2}$.

Answer:

A $3 \times 2$ matrix has 3 rows and 2 columns. Let the matrix be $A = [a_{ij}]$.

The elements of the matrix are given by the formula $a_{ij} = \frac{|i-j|}{2}$.


We need to calculate the elements for $i \in \{1, 2, 3\}$ and $j \in \{1, 2\}$.

For $i=1$:

$a_{11} = \frac{|1-1|}{2} = \frac{|0|}{2} = \frac{0}{2} = 0$

$a_{12} = \frac{|1-2|}{2} = \frac{|-1|}{2} = \frac{1}{2}$


For $i=2$:

$a_{21} = \frac{|2-1|}{2} = \frac{|1|}{2} = \frac{1}{2}$

$a_{22} = \frac{|2-2|}{2} = \frac{|0|}{2} = \frac{0}{2} = 0$


For $i=3$:

$a_{31} = \frac{|3-1|}{2} = \frac{|2|}{2} = \frac{2}{2} = 1$

$a_{32} = \frac{|3-2|}{2} = \frac{|1|}{2} = \frac{1}{2}$


Thus, the required $3 \times 2$ matrix is:

$A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \\ a_{31} & a_{32} \end{bmatrix} = \begin{bmatrix} 0 & \frac{1}{2} \\ \frac{1}{2} & 0 \\ 1 & \frac{1}{2} \end{bmatrix}$
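
The same construction can be automated. Below is a minimal sketch (assuming NumPy) that builds the matrix directly from the formula $a_{ij} = \frac{|i-j|}{2}$ with 1-based indices, as in the question:

```python
import numpy as np

# a_ij = |i - j| / 2, with i = 1..3 (rows) and j = 1..2 (columns)
A = np.array([[abs(i - j) / 2 for j in range(1, 3)] for i in range(1, 4)])
print(A)
# [[0.  0.5]
#  [0.5 0. ]
#  [1.  0.5]]
```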

Question 2. If a matrix has 18 elements, what are the possible orders it can have?

Answer:

Let the order of the matrix be $m \times n$, where $m$ is the number of rows and $n$ is the number of columns.

The total number of elements in a matrix of order $m \times n$ is given by the product $m \times n$.


Given that the matrix has 18 elements, we have:

$m \times n = 18$

We need to find all possible pairs of positive integers $(m, n)$ whose product is 18.


The pairs of positive integers whose product is 18 are the pairs of factors of 18.

The factors of 18 are 1, 2, 3, 6, 9, and 18.


The possible pairs $(m, n)$ such that $m \times n = 18$ are:

$(1, 18)$

$(18, 1)$

$(2, 9)$

$(9, 2)$

$(3, 6)$

$(6, 3)$


Each pair $(m, n)$ corresponds to a possible order $m \times n$ for the matrix.

Therefore, the possible orders are:

$1 \times 18$

$18 \times 1$

$2 \times 9$

$9 \times 2$

$3 \times 6$

$6 \times 3$


There are 6 possible orders for a matrix with 18 elements.
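
The factor-pair enumeration can also be done mechanically. A short sketch (plain Python, no libraries needed):

```python
# Every order m x n with m * n = 18 corresponds to a divisor m of 18
orders = [(m, 18 // m) for m in range(1, 19) if 18 % m == 0]
print(orders)       # [(1, 18), (2, 9), (3, 6), (6, 3), (9, 2), (18, 1)]
print(len(orders))  # 6
```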

Question 3. If $\begin{pmatrix} x+y & 2 \\ 5+z & xy \end{pmatrix} = \begin{pmatrix} 6 & 2 \\ 5 & 8 \end{pmatrix}$, find the values of $x, y, z$.

Answer:

Two matrices are equal if and only if their corresponding elements are equal.

Equating the corresponding elements of the given matrices, we get the following system of equations:

$x+y = 6$

$2 = 2$

$5+z = 5$

$xy = 8$


From the equation $5+z = 5$, we can solve for $z$:

$z = 5 - 5$

$z = 0$


Now we need to solve the system of equations involving $x$ and $y$:

$x+y = 6$

$xy = 8$


From the first equation, we can express $y$ in terms of $x$:

$y = 6 - x$

Substitute this expression for $y$ into the second equation:

$x(6 - x) = 8$

$6x - x^2 = 8$


Rearranging the terms to form a quadratic equation:

$x^2 - 6x + 8 = 0$


We can solve this quadratic equation by factoring:

We look for two numbers that multiply to 8 and add up to -6. These numbers are -2 and -4.

So, the equation can be factored as:

$(x - 2)(x - 4) = 0$


This gives two possible values for $x$:

$x - 2 = 0 \implies x = 2$

or

$x - 4 = 0 \implies x = 4$


Now we find the corresponding values for $y$ using the relation $y = 6 - x$:

If $x = 2$, then $y = 6 - 2 = 4$.

If $x = 4$, then $y = 6 - 4 = 2$.


In both cases, $z = 0$.

Thus, the possible values for $(x, y, z)$ are $(2, 4, 0)$ and $(4, 2, 0)$.

The values of $x, y, z$ are either $x=2, y=4, z=0$ or $x=4, y=2, z=0$.
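
Since $x$ and $y$ have a known sum and product, they are the roots of $t^2 - 6t + 8 = 0$, which can be checked numerically. A minimal sketch (assuming NumPy):

```python
import numpy as np

# x and y are the roots of t^2 - (x+y)t + xy = t^2 - 6t + 8
roots = np.roots([1, -6, 8])
print(sorted(roots))  # [2.0, 4.0] -> (x, y) = (2, 4) or (4, 2); z = 0 from 5 + z = 5
```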

Question 4. If $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ and $B = \begin{pmatrix} -1 & 0 \\ 2 & 5 \end{pmatrix}$, find $A+B$ and $A-B$.

Answer:

To find the sum of two matrices $A$ and $B$ of the same order, we add their corresponding elements.

Given matrices are $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ and $B = \begin{pmatrix} -1 & 0 \\ 2 & 5 \end{pmatrix}$. Both matrices are of order $2 \times 2$, so addition is possible.


Calculation for $A+B$:

$A+B = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} + \begin{pmatrix} -1 & 0 \\ 2 & 5 \end{pmatrix}$

$A+B = \begin{pmatrix} 1 + (-1) & 2 + 0 \\ 3 + 2 & 4 + 5 \end{pmatrix}$

$A+B = \begin{pmatrix} 0 & 2 \\ 5 & 9 \end{pmatrix}$


To find the difference of two matrices $A$ and $B$ of the same order, we subtract the elements of $B$ from the corresponding elements of $A$.


Calculation for $A-B$:

$A-B = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} - \begin{pmatrix} -1 & 0 \\ 2 & 5 \end{pmatrix}$

$A-B = \begin{pmatrix} 1 - (-1) & 2 - 0 \\ 3 - 2 & 4 - 5 \end{pmatrix}$

$A-B = \begin{pmatrix} 2 & 2 \\ 1 & -1 \end{pmatrix}$


Thus, the required matrices are:

$A+B = \begin{pmatrix} 0 & 2 \\ 5 & 9 \end{pmatrix}$

$A-B = \begin{pmatrix} 2 & 2 \\ 1 & -1 \end{pmatrix}$
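
In NumPy, element-wise addition, subtraction, and scalar multiplication (the operation used in the next question) are all one-liners. A minimal sketch using the matrices from this question:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[-1, 0], [2, 5]])

print(A + B)  # [[0 2] [5 9]]
print(A - B)  # [[2 2] [1 -1]]
print(3 * A)  # scalar multiple: every element of A is multiplied by 3
```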

Question 5. If $A = \begin{pmatrix} 2 & 3 \\ 4 & 1 \end{pmatrix}$ and $k = 3$, find $kA$.

Answer:

To find the scalar product of a matrix by a scalar, we multiply each element of the matrix by that scalar.

Given matrix $A = \begin{pmatrix} 2 & 3 \\ 4 & 1 \end{pmatrix}$ and scalar $k = 3$.


We need to find $kA$.

$kA = 3 \times \begin{pmatrix} 2 & 3 \\ 4 & 1 \end{pmatrix}$


Multiply each element of matrix $A$ by 3:

$kA = \begin{pmatrix} 3 \times 2 & 3 \times 3 \\ 3 \times 4 & 3 \times 1 \end{pmatrix}$

$kA = \begin{pmatrix} 6 & 9 \\ 12 & 3 \end{pmatrix}$


The resulting matrix $kA$ is $\begin{pmatrix} 6 & 9 \\ 12 & 3 \end{pmatrix}$.

Question 6. If $A = \begin{pmatrix} 1 & -2 \\ 3 & 4 \end{pmatrix}$ and $B = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}$, find AB if it exists.

Answer:

For the product of two matrices $A$ and $B$ to be defined, the number of columns in matrix $A$ must be equal to the number of rows in matrix $B$.


Matrix $A$ has order $2 \times 2$ (2 rows and 2 columns).

Matrix $B$ has order $2 \times 3$ (2 rows and 3 columns).


The number of columns in $A$ is 2.

The number of rows in $B$ is 2.

Since the number of columns in $A$ (2) is equal to the number of rows in $B$ (2), the product $AB$ exists.


The order of the resulting matrix $AB$ will be (number of rows in $A$) $\times$ (number of columns in $B$), which is $2 \times 3$.


Let $C = AB$. The element $c_{ij}$ of the product matrix is obtained by multiplying the $i$-th row of $A$ by the $j$-th column of $B$ and summing the products.

$A = \begin{pmatrix} 1 & -2 \\ 3 & 4 \end{pmatrix}$, $B = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}$


Calculate the elements of $C = AB$:

$c_{11} = (1)(1) + (-2)(4) = 1 - 8 = -7$

$c_{12} = (1)(2) + (-2)(5) = 2 - 10 = -8$

$c_{13} = (1)(3) + (-2)(6) = 3 - 12 = -9$


$c_{21} = (3)(1) + (4)(4) = 3 + 16 = 19$

$c_{22} = (3)(2) + (4)(5) = 6 + 20 = 26$

$c_{23} = (3)(3) + (4)(6) = 9 + 24 = 33$


Therefore, the product matrix $AB$ is:

$AB = \begin{pmatrix} -7 & -8 & -9 \\ 19 & 26 & 33 \end{pmatrix}$
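
A minimal NumPy sketch of the same computation, including the compatibility check on the orders (not part of the textbook solution):

```python
import numpy as np

A = np.array([[1, -2], [3, 4]])       # order 2 x 2
B = np.array([[1, 2, 3], [4, 5, 6]])  # order 2 x 3

# The product AB exists because columns of A == rows of B
assert A.shape[1] == B.shape[0]
print(A @ B)
# [[-7 -8 -9]
#  [19 26 33]]
```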

Question 7. Find the transpose of the matrix $A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}$.

Answer:

The transpose of a matrix is obtained by interchanging its rows and columns.

If $A$ is an $m \times n$ matrix, its transpose, denoted by $A^T$ or $A'$, is an $n \times m$ matrix.


Given matrix $A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}$.

Matrix $A$ has 2 rows and 3 columns, so its order is $2 \times 3$.


The transpose matrix $A^T$ will have 3 rows and 2 columns, so its order will be $3 \times 2$.

The first row of $A$ becomes the first column of $A^T$.

The second row of $A$ becomes the second column of $A^T$.


So, $A^T$ is obtained by writing the rows of $A$ as columns:

Row 1 of A: $\begin{pmatrix} 1 & 2 & 3 \end{pmatrix}$ becomes Column 1 of $A^T$: $\begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}$

Row 2 of A: $\begin{pmatrix} 4 & 5 & 6 \end{pmatrix}$ becomes Column 2 of $A^T$: $\begin{pmatrix} 4 \\ 5 \\ 6 \end{pmatrix}$


Therefore, the transpose of matrix A is:

$A^T = \begin{pmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{pmatrix}$

Question 8. If $A = \begin{pmatrix} 1 & 5 \\ 6 & 7 \end{pmatrix}$, verify that $(A')' = A$.

Answer:

Given matrix $A = \begin{pmatrix} 1 & 5 \\ 6 & 7 \end{pmatrix}$.


First, we find the transpose of matrix $A$, denoted by $A'$. To find the transpose, we interchange the rows and columns of $A$.

The first row of $A$ is $\begin{pmatrix} 1 & 5 \end{pmatrix}$, which becomes the first column of $A'$.

The second row of $A$ is $\begin{pmatrix} 6 & 7 \end{pmatrix}$, which becomes the second column of $A'$.


So, $A' = \begin{pmatrix} 1 & 6 \\ 5 & 7 \end{pmatrix}$.


Next, we find the transpose of $A'$, denoted by $(A')'$. We interchange the rows and columns of $A'$.

The first row of $A'$ is $\begin{pmatrix} 1 & 6 \end{pmatrix}$, which becomes the first column of $(A')'$.

The second row of $A'$ is $\begin{pmatrix} 5 & 7 \end{pmatrix}$, which becomes the second column of $(A')'$.


So, $(A')' = \begin{pmatrix} 1 & 5 \\ 6 & 7 \end{pmatrix}$.


Comparing $(A')'$ with the original matrix $A$, we see that:

$(A')' = \begin{pmatrix} 1 & 5 \\ 6 & 7 \end{pmatrix}$

$A = \begin{pmatrix} 1 & 5 \\ 6 & 7 \end{pmatrix}$


Since the corresponding elements are equal, we have $(A')' = A$.

Thus, the property $(A')' = A$ is verified for the given matrix $A$.

Question 9. If $A = \begin{pmatrix} 2 & 3 \\ 4 & 5 \end{pmatrix}$ and $B = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$, verify that $(A+B)' = A' + B'$.

Answer:

Given matrices are $A = \begin{pmatrix} 2 & 3 \\ 4 & 5 \end{pmatrix}$ and $B = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$.


We need to verify the property $(A+B)' = A' + B'$. We will evaluate both sides of the equation separately.


Left Hand Side (LHS): $(A+B)'$

First, find the sum $A+B$:

$A+B = \begin{pmatrix} 2 & 3 \\ 4 & 5 \end{pmatrix} + \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$

$A+B = \begin{pmatrix} 2+1 & 3+2 \\ 4+3 & 5+4 \end{pmatrix}$

$A+B = \begin{pmatrix} 3 & 5 \\ 7 & 9 \end{pmatrix}$


Next, find the transpose of $(A+B)$:

$(A+B)' = \begin{pmatrix} 3 & 7 \\ 5 & 9 \end{pmatrix}$

This is the value of the LHS.


Right Hand Side (RHS): $A' + B'$

First, find the transpose of matrix $A$, $A'$:

$A' = \begin{pmatrix} 2 & 4 \\ 3 & 5 \end{pmatrix}$


Next, find the transpose of matrix $B$, $B'$:

$B' = \begin{pmatrix} 1 & 3 \\ 2 & 4 \end{pmatrix}$


Now, find the sum of $A'$ and $B'$:

$A' + B' = \begin{pmatrix} 2 & 4 \\ 3 & 5 \end{pmatrix} + \begin{pmatrix} 1 & 3 \\ 2 & 4 \end{pmatrix}$

$A' + B' = \begin{pmatrix} 2+1 & 4+3 \\ 3+2 & 5+4 \end{pmatrix}$

$A' + B' = \begin{pmatrix} 3 & 7 \\ 5 & 9 \end{pmatrix}$

This is the value of the RHS.


Comparing the LHS and RHS:

LHS = $(A+B)' = \begin{pmatrix} 3 & 7 \\ 5 & 9 \end{pmatrix}$

RHS = $A' + B' = \begin{pmatrix} 3 & 7 \\ 5 & 9 \end{pmatrix}$


Since LHS = RHS, the property $(A+B)' = A' + B'$ is verified for the given matrices $A$ and $B$.
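
The transpose properties from Questions 7–9 are easy to confirm numerically. A minimal sketch (assuming NumPy, where `.T` gives the transpose):

```python
import numpy as np

A = np.array([[2, 3], [4, 5]])
B = np.array([[1, 2], [3, 4]])

print(np.array_equal(A.T.T, A))              # (A')' = A
print(np.array_equal((A + B).T, A.T + B.T))  # (A+B)' = A' + B'
```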

Question 10. Define a symmetric matrix. Give an example of a $3 \times 3$ symmetric matrix.

Answer:

Definition of a Symmetric Matrix:

A square matrix $A$ is called a symmetric matrix if its transpose is equal to the matrix itself. That is, $A' = A$.

In terms of elements, a square matrix $A = [a_{ij}]$ of order $n \times n$ is symmetric if and only if $a_{ij} = a_{ji}$ for all possible values of $i$ and $j$ from 1 to $n$.


Example of a $3 \times 3$ Symmetric Matrix:

Let's construct a $3 \times 3$ matrix where the elements $a_{ij}$ are equal to $a_{ji}$.

Consider the matrix $A = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix}$.

For $A$ to be symmetric, we must have $a_{12} = a_{21}$, $a_{13} = a_{31}$, and $a_{23} = a_{32}$. The diagonal elements ($a_{11}$, $a_{22}$, $a_{33}$) can be any values.


Let's choose some values satisfying these conditions:

$a_{11} = 1$, $a_{22} = 5$, $a_{33} = 9$

$a_{12} = 2$, $a_{21} = 2$

$a_{13} = 3$, $a_{31} = 3$

$a_{23} = 4$, $a_{32} = 4$


Using these values, the matrix is:

$A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 5 & 4 \\ 3 & 4 & 9 \end{pmatrix}$


Let's find the transpose of $A$, denoted by $A'$:

$A' = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 5 & 4 \\ 3 & 4 & 9 \end{pmatrix}' = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 5 & 4 \\ 3 & 4 & 9 \end{pmatrix}$


Since $A' = A$, the matrix $A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 5 & 4 \\ 3 & 4 & 9 \end{pmatrix}$ is a symmetric matrix.

Question 11. Define a skew-symmetric matrix. Give an example of a $2 \times 2$ skew-symmetric matrix.

Answer:

Definition of a Skew-Symmetric Matrix:

A square matrix $A$ is called a skew-symmetric matrix if its transpose is equal to the negative of the matrix itself. That is, $A' = -A$.

In terms of elements, a square matrix $A = [a_{ij}]$ of order $n \times n$ is skew-symmetric if and only if $a_{ij} = -a_{ji}$ for all possible values of $i$ and $j$ from 1 to $n$.

For the diagonal elements ($i=j$), the condition becomes $a_{ii} = -a_{ii}$, which implies $2a_{ii} = 0$, so $a_{ii} = 0$. Thus, all diagonal elements of a skew-symmetric matrix must be zero.


Example of a $2 \times 2$ Skew-Symmetric Matrix:

Let's construct a $2 \times 2$ matrix $A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}$ such that $A' = -A$.

The conditions for skew-symmetry are:

$a_{11} = 0$

$a_{22} = 0$

$a_{12} = -a_{21}$


Let's choose a non-zero value for $a_{12}$, say $a_{12} = 5$.

Then, according to the condition $a_{12} = -a_{21}$, we have $5 = -a_{21}$, which means $a_{21} = -5$.


Using these values, we form the matrix:

$A = \begin{pmatrix} 0 & 5 \\ -5 & 0 \end{pmatrix}$


Now let's find the transpose of $A$, $A'$:

$A' = \begin{pmatrix} 0 & -5 \\ 5 & 0 \end{pmatrix}$


Next, let's find the negative of the matrix $A$, $-A$:

$-A = -1 \times \begin{pmatrix} 0 & 5 \\ -5 & 0 \end{pmatrix} = \begin{pmatrix} -1 \times 0 & -1 \times 5 \\ -1 \times (-5) & -1 \times 0 \end{pmatrix} = \begin{pmatrix} 0 & -5 \\ 5 & 0 \end{pmatrix}$


Comparing $A'$ and $-A$, we see that:

$A' = \begin{pmatrix} 0 & -5 \\ 5 & 0 \end{pmatrix}$

$-A = \begin{pmatrix} 0 & -5 \\ 5 & 0 \end{pmatrix}$


Since $A' = -A$, the matrix $A = \begin{pmatrix} 0 & 5 \\ -5 & 0 \end{pmatrix}$ is a skew-symmetric matrix.
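
Both defining conditions from Questions 10 and 11 reduce to one-line checks. A minimal sketch using the two example matrices constructed above (assuming NumPy):

```python
import numpy as np

S = np.array([[1, 2, 3],
              [2, 5, 4],
              [3, 4, 9]])
K = np.array([[0, 5],
              [-5, 0]])

print(np.array_equal(S.T, S))   # True: S is symmetric (S' = S)
print(np.array_equal(K.T, -K))  # True: K is skew-symmetric (K' = -K)
```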

Question 12. Express the matrix $A = \begin{pmatrix} 3 & 5 \\ 1 & -1 \end{pmatrix}$ as the sum of a symmetric and a skew-symmetric matrix.

Answer:

Any square matrix $A$ can be expressed as the sum of a symmetric matrix and a skew-symmetric matrix.

The decomposition is given by $A = S + K$, where $S$ is a symmetric matrix and $K$ is a skew-symmetric matrix.

The symmetric part $S$ is given by $S = \frac{1}{2}(A+A')$.

The skew-symmetric part $K$ is given by $K = \frac{1}{2}(A-A')$.


Given matrix $A = \begin{pmatrix} 3 & 5 \\ 1 & -1 \end{pmatrix}$.


First, find the transpose of $A$, $A'$:

$A' = \begin{pmatrix} 3 & 5 \\ 1 & -1 \end{pmatrix}' = \begin{pmatrix} 3 & 1 \\ 5 & -1 \end{pmatrix}$


Now, calculate $A+A'$:

$A+A' = \begin{pmatrix} 3 & 5 \\ 1 & -1 \end{pmatrix} + \begin{pmatrix} 3 & 1 \\ 5 & -1 \end{pmatrix} = \begin{pmatrix} 3+3 & 5+1 \\ 1+5 & -1+(-1) \end{pmatrix} = \begin{pmatrix} 6 & 6 \\ 6 & -2 \end{pmatrix}$


Calculate the symmetric part $S = \frac{1}{2}(A+A')$:

$S = \frac{1}{2} \begin{pmatrix} 6 & 6 \\ 6 & -2 \end{pmatrix} = \begin{pmatrix} \frac{1}{2} \times 6 & \frac{1}{2} \times 6 \\ \frac{1}{2} \times 6 & \frac{1}{2} \times (-2) \end{pmatrix} = \begin{pmatrix} 3 & 3 \\ 3 & -1 \end{pmatrix}$

To verify $S$ is symmetric, we find $S'$:

$S' = \begin{pmatrix} 3 & 3 \\ 3 & -1 \end{pmatrix}' = \begin{pmatrix} 3 & 3 \\ 3 & -1 \end{pmatrix} = S$. Thus, $S$ is symmetric.


Now, calculate $A-A'$:

$A-A' = \begin{pmatrix} 3 & 5 \\ 1 & -1 \end{pmatrix} - \begin{pmatrix} 3 & 1 \\ 5 & -1 \end{pmatrix} = \begin{pmatrix} 3-3 & 5-1 \\ 1-5 & -1-(-1) \end{pmatrix} = \begin{pmatrix} 0 & 4 \\ -4 & 0 \end{pmatrix}$


Calculate the skew-symmetric part $K = \frac{1}{2}(A-A')$:

$K = \frac{1}{2} \begin{pmatrix} 0 & 4 \\ -4 & 0 \end{pmatrix} = \begin{pmatrix} \frac{1}{2} \times 0 & \frac{1}{2} \times 4 \\ \frac{1}{2} \times (-4) & \frac{1}{2} \times 0 \end{pmatrix} = \begin{pmatrix} 0 & 2 \\ -2 & 0 \end{pmatrix}$

To verify $K$ is skew-symmetric, we find $K'$:

$K' = \begin{pmatrix} 0 & 2 \\ -2 & 0 \end{pmatrix}' = \begin{pmatrix} 0 & -2 \\ 2 & 0 \end{pmatrix}$

$-K = -\begin{pmatrix} 0 & 2 \\ -2 & 0 \end{pmatrix} = \begin{pmatrix} 0 & -2 \\ 2 & 0 \end{pmatrix}$. Thus, $K' = -K$, and $K$ is skew-symmetric.


Finally, express $A$ as the sum of $S$ and $K$:

$S + K = \begin{pmatrix} 3 & 3 \\ 3 & -1 \end{pmatrix} + \begin{pmatrix} 0 & 2 \\ -2 & 0 \end{pmatrix} = \begin{pmatrix} 3+0 & 3+2 \\ 3+(-2) & -1+0 \end{pmatrix} = \begin{pmatrix} 3 & 5 \\ 1 & -1 \end{pmatrix}$

This is equal to the original matrix $A$.


Thus, the matrix $A$ is expressed as the sum of a symmetric and a skew-symmetric matrix as:

$A = \underbrace{\begin{pmatrix} 3 & 3 \\ 3 & -1 \end{pmatrix}}_{\text{Symmetric}} + \underbrace{\begin{pmatrix} 0 & 2 \\ -2 & 0 \end{pmatrix}}_{\text{Skew-symmetric}}$
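
The decomposition formulas $S = \frac{1}{2}(A+A')$ and $K = \frac{1}{2}(A-A')$ translate directly into code. A minimal sketch (assuming NumPy) reproducing the result above:

```python
import numpy as np

A = np.array([[3.0, 5.0], [1.0, -1.0]])
S = (A + A.T) / 2  # symmetric part
K = (A - A.T) / 2  # skew-symmetric part

print(S)                         # [[ 3.  3.] [ 3. -1.]]
print(K)                         # [[ 0.  2.] [-2.  0.]]
print(np.array_equal(S + K, A))  # True: A = S + K
```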

Question 13. If $A = \begin{pmatrix} \cos \alpha & -\sin \alpha \\ \sin \alpha & \cos \alpha \end{pmatrix}$, verify that $A'A = I$, where I is the identity matrix.

Answer:

Given:

Matrix $A = \begin{pmatrix} \cos \alpha & -\sin \alpha \\ \sin \alpha & \cos \alpha \end{pmatrix}$.

$I$ is the identity matrix.


To Verify:

$A'A = I$


Solution:

The given matrix $A$ is a $2 \times 2$ matrix.

The identity matrix of order $2 \times 2$ is $I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$.


First, we find the transpose of matrix $A$, denoted by $A'$. To find the transpose, we interchange the rows and columns of $A$.

$A' = \begin{pmatrix} \cos \alpha & -\sin \alpha \\ \sin \alpha & \cos \alpha \end{pmatrix}' = \begin{pmatrix} \cos \alpha & \sin \alpha \\ -\sin \alpha & \cos \alpha \end{pmatrix}$


Next, we calculate the product $A'A$. Since $A'$ is $2 \times 2$ and $A$ is $2 \times 2$, the product $A'A$ will be a $2 \times 2$ matrix.

$A'A = \begin{pmatrix} \cos \alpha & \sin \alpha \\ -\sin \alpha & \cos \alpha \end{pmatrix} \begin{pmatrix} \cos \alpha & -\sin \alpha \\ \sin \alpha & \cos \alpha \end{pmatrix}$


Multiply the matrices by performing row-by-column multiplication:

Element (1,1) = (1st row of $A'$) $\cdot$ (1st column of $A$)

$= (\cos \alpha)(\cos \alpha) + (\sin \alpha)(\sin \alpha) = \cos^2 \alpha + \sin^2 \alpha$


Element (1,2) = (1st row of $A'$) $\cdot$ (2nd column of $A$)

$= (\cos \alpha)(-\sin \alpha) + (\sin \alpha)(\cos \alpha) = -\cos \alpha \sin \alpha + \sin \alpha \cos \alpha$


Element (2,1) = (2nd row of $A'$) $\cdot$ (1st column of $A$)

$= (-\sin \alpha)(\cos \alpha) + (\cos \alpha)(\sin \alpha) = -\sin \alpha \cos \alpha + \cos \alpha \sin \alpha$


Element (2,2) = (2nd row of $A'$) $\cdot$ (2nd column of $A$)

$= (-\sin \alpha)(-\sin \alpha) + (\cos \alpha)(\cos \alpha) = \sin^2 \alpha + \cos^2 \alpha$


Using the fundamental trigonometric identity $\sin^2 \theta + \cos^2 \theta = 1$, we have:

Element (1,1) = $1$

Element (1,2) = $0$

Element (2,1) = $0$

Element (2,2) = $1$


So, the product matrix $A'A$ is:

$A'A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$


This result is the $2 \times 2$ identity matrix, $I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$.


Therefore, we have shown that $A'A = I$.

The property is verified.
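
A minimal numerical check (assuming NumPy; the angle is an arbitrary choice, since the identity holds for every $\alpha$):

```python
import numpy as np

alpha = 0.7  # any angle works; this value is arbitrary
A = np.array([[np.cos(alpha), -np.sin(alpha)],
              [np.sin(alpha),  np.cos(alpha)]])

print(np.allclose(A.T @ A, np.eye(2)))  # True: A'A = I
```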

Question 14. If $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ and $B = \begin{pmatrix} 2 & 1 \\ 0 & -1 \end{pmatrix}$, verify that $(AB)' = B'A'$.

Answer:

Given:

$A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ and $B = \begin{pmatrix} 2 & 1 \\ 0 & -1 \end{pmatrix}$.


To Verify:

$(AB)' = B'A'$


Solution:

We will evaluate both sides of the equation separately.


Left Hand Side (LHS): $(AB)'$

First, calculate the product $AB$. Since $A$ is $2 \times 2$ and $B$ is $2 \times 2$, the product $AB$ will be a $2 \times 2$ matrix.

$AB = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 2 & 1 \\ 0 & -1 \end{pmatrix}$

$AB = \begin{pmatrix} (1)(2) + (2)(0) & (1)(1) + (2)(-1) \\ (3)(2) + (4)(0) & (3)(1) + (4)(-1) \end{pmatrix}$

$AB = \begin{pmatrix} 2 + 0 & 1 - 2 \\ 6 + 0 & 3 - 4 \end{pmatrix}$

$AB = \begin{pmatrix} 2 & -1 \\ 6 & -1 \end{pmatrix}$


Now, find the transpose of $AB$, $(AB)'$. To find the transpose, interchange the rows and columns of $AB$.

$(AB)' = \begin{pmatrix} 2 & -1 \\ 6 & -1 \end{pmatrix}' = \begin{pmatrix} 2 & 6 \\ -1 & -1 \end{pmatrix}$

This is the value of the LHS.


Right Hand Side (RHS): $B'A'$

First, find the transpose of matrix $A$, $A'$.

$A' = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}' = \begin{pmatrix} 1 & 3 \\ 2 & 4 \end{pmatrix}$


Next, find the transpose of matrix $B$, $B'$.

$B' = \begin{pmatrix} 2 & 1 \\ 0 & -1 \end{pmatrix}' = \begin{pmatrix} 2 & 0 \\ 1 & -1 \end{pmatrix}$


Now, calculate the product $B'A'$. Since $B'$ is $2 \times 2$ and $A'$ is $2 \times 2$, the product $B'A'$ will be a $2 \times 2$ matrix.

$B'A' = \begin{pmatrix} 2 & 0 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} 1 & 3 \\ 2 & 4 \end{pmatrix}$

$B'A' = \begin{pmatrix} (2)(1) + (0)(2) & (2)(3) + (0)(4) \\ (1)(1) + (-1)(2) & (1)(3) + (-1)(4) \end{pmatrix}$

$B'A' = \begin{pmatrix} 2 + 0 & 6 + 0 \\ 1 - 2 & 3 - 4 \end{pmatrix}$

$B'A' = \begin{pmatrix} 2 & 6 \\ -1 & -1 \end{pmatrix}$

This is the value of the RHS.


Verification:

Comparing the LHS and RHS:

LHS = $(AB)' = \begin{pmatrix} 2 & 6 \\ -1 & -1 \end{pmatrix}$

RHS = $B'A' = \begin{pmatrix} 2 & 6 \\ -1 & -1 \end{pmatrix}$


Since LHS = RHS, the property $(AB)' = B'A'$ is verified for the given matrices $A$ and $B$.

Question 15. If $A$ is a square matrix, prove that $A+A'$ is a symmetric matrix and $A-A'$ is a skew-symmetric matrix.

Answer:

Given:

$A$ is a square matrix.


To Prove:

1. $A+A'$ is a symmetric matrix.

2. $A-A'$ is a skew-symmetric matrix.


Proof:

A matrix $S$ is symmetric if its transpose $S'$ is equal to $S$ ($S' = S$).

A matrix $K$ is skew-symmetric if its transpose $K'$ is equal to the negative of $K$ ($K' = -K$).

We will use the following properties of matrix transpose for square matrices $A$ and $B$, and scalar $c$:

$(A+B)' = A' + B'$

$(A')' = A$

$(cA)' = cA'$


Part 1: Prove that $A+A'$ is a symmetric matrix.

Let $S = A+A'$. To prove that $S$ is symmetric, we need to show that $S' = S$.

Consider the transpose of $S$:

$S' = (A+A')'$

Using the property $(A+B)' = A' + B'$, we get:

$S' = A' + (A')'$

Using the property $(A')' = A$, we get:

$S' = A' + A$

Since matrix addition is commutative, $A' + A = A + A'$.

So, $S' = A + A'$

By definition, $S = A+A'$. Therefore, $S' = S$.

Hence, $A+A'$ is a symmetric matrix.


Part 2: Prove that $A-A'$ is a skew-symmetric matrix.

Let $K = A-A'$. To prove that $K$ is skew-symmetric, we need to show that $K' = -K$.

Consider the transpose of $K$:

$K' = (A-A')'$

Using the property $(A-B)' = A' - B'$, which is a consequence of $(A+B)'=A'+B'$ and $(cA)'=cA'$ with $c=-1$, we get:

$K' = A' - (A')'$

Using the property $(A')' = A$, we get:

$K' = A' - A$


Now consider $-K$:

$-K = -(A-A')$

Multiplying by -1:

$-K = -A + A'$

Rearranging the terms:

$-K = A' - A$


Comparing $K'$ and $-K$, we see that $K' = A' - A$ and $-K = A' - A$.

Therefore, $K' = -K$.

Hence, $A-A'$ is a skew-symmetric matrix.


Both parts of the proof are complete.
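
Since the proof holds for every square matrix, a randomly generated matrix illustrates it just as well as a hand-picked one. A minimal sketch (assuming NumPy; the seed and size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(4, 4))  # any square matrix will do

S = A + A.T
K = A - A.T
print(np.array_equal(S.T, S))   # True: A + A' is symmetric
print(np.array_equal(K.T, -K))  # True: A - A' is skew-symmetric
```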

Question 16. Find the product of matrices $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ and $B = \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix}$.

Answer:

Given matrices $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ and $B = \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix}$.

Both matrices are of order $2 \times 2$. Since the number of columns in $A$ (2) is equal to the number of rows in $B$ (2), the product $AB$ is defined and the resulting matrix will be of order $2 \times 2$.


Let $C = AB$. The element $c_{ij}$ of the product matrix is obtained by multiplying the $i$-th row of $A$ by the $j$-th column of $B$ and summing the products.

$A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$, $B = \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix}$


Calculate the elements of $C = AB$:

$c_{11}$ (1st row of $A$ $\times$ 1st column of $B$) $= (1)(5) + (2)(7) = 5 + 14 = 19$

$c_{12}$ (1st row of $A$ $\times$ 2nd column of $B$) $= (1)(6) + (2)(8) = 6 + 16 = 22$


$c_{21}$ (2nd row of $A$ $\times$ 1st column of $B$) $= (3)(5) + (4)(7) = 15 + 28 = 43$

$c_{22}$ (2nd row of $A$ $\times$ 2nd column of $B$) $= (3)(6) + (4)(8) = 18 + 32 = 50$


Therefore, the product matrix $AB$ is:

$AB = \begin{pmatrix} 19 & 22 \\ 43 & 50 \end{pmatrix}$

Question 17. If $A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$, show that $A^2 = -I$, where I is the $2 \times 2$ identity matrix.

Answer:

Given:

Matrix $A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$.

$I$ is the $2 \times 2$ identity matrix, $I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$.


To Show:

$A^2 = -I$


Solution:

First, we calculate $A^2$, which is the product of matrix $A$ with itself: $A^2 = A \times A$.

$A^2 = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$


Perform the matrix multiplication:

Element in the 1st row, 1st column: $(0)(0) + (-1)(1) = 0 - 1 = -1$

Element in the 1st row, 2nd column: $(0)(-1) + (-1)(0) = 0 + 0 = 0$

Element in the 2nd row, 1st column: $(1)(0) + (0)(1) = 0 + 0 = 0$

Element in the 2nd row, 2nd column: $(1)(-1) + (0)(0) = -1 + 0 = -1$


So, the resulting matrix $A^2$ is:

$A^2 = \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}$


Now, we calculate $-I$ using the given identity matrix $I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$.

$-I = -1 \times I = -1 \times \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} -1 \times 1 & -1 \times 0 \\ -1 \times 0 & -1 \times 1 \end{pmatrix}$

$-I = \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}$


Comparing the result of $A^2$ with the result of $-I$, we see that:

$A^2 = \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}$

$-I = \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}$


Since the corresponding elements are equal, we have $A^2 = -I$.

This shows that $A^2$ is indeed equal to $-I$ for the given matrix $A$.
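
A one-line confirmation (assuming NumPy):

```python
import numpy as np

A = np.array([[0, -1], [1, 0]])
print(A @ A)                              # [[-1  0] [ 0 -1]], i.e. -I
print(np.array_equal(A @ A, -np.eye(2)))  # True
```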

Question 18. Find the elementary row operation that transforms the matrix $\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ into $\begin{pmatrix} 1 & 2 \\ 0 & -2 \end{pmatrix}$.

Answer:

Let the original matrix be $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ and the transformed matrix be $B = \begin{pmatrix} 1 & 2 \\ 0 & -2 \end{pmatrix}$.


We compare the rows of matrix $A$ with the rows of matrix $B$.

The first row of $A$ is $\begin{pmatrix} 1 & 2 \end{pmatrix}$, which is the same as the first row of $B$. So, the first row has not been changed.

The second row of $A$ is $\begin{pmatrix} 3 & 4 \end{pmatrix}$ and the second row of $B$ is $\begin{pmatrix} 0 & -2 \end{pmatrix}$. The second row has been changed.


Elementary row operations are operations performed on the rows of a matrix. They include:

1. Swapping two rows.

2. Multiplying a row by a non-zero scalar.

3. Adding a multiple of one row to another row.


Since the first row is unchanged, and the second row is different, the operation must involve the second row, possibly using the first row.

Let $R_1 = \begin{pmatrix} 1 & 2 \end{pmatrix}$ and $R_2 = \begin{pmatrix} 3 & 4 \end{pmatrix}$ be the rows of matrix $A$.

Let $R'_1 = \begin{pmatrix} 1 & 2 \end{pmatrix}$ and $R'_2 = \begin{pmatrix} 0 & -2 \end{pmatrix}$ be the rows of matrix $B$.

We see that $R'_1 = R_1$.

We look for an operation of the form $R_2 \to R_2 + cR_1$ that transforms $R_2$ into $R'_2$.

We want $R_2 + cR_1 = R'_2$.

$\begin{pmatrix} 3 & 4 \end{pmatrix} + c \begin{pmatrix} 1 & 2 \end{pmatrix} = \begin{pmatrix} 0 & -2 \end{pmatrix}$

$\begin{pmatrix} 3 + c & 4 + 2c \end{pmatrix} = \begin{pmatrix} 0 & -2 \end{pmatrix}$


Equating the corresponding elements, we get a system of equations:

$3 + c = 0$

$4 + 2c = -2$


From the first equation, $c = -3$.

Let's check if this value of $c$ satisfies the second equation:

$4 + 2(-3) = 4 - 6 = -2$. This matches the second element of $R'_2$.


Thus, the elementary row operation is to replace the second row ($R_2$) with the second row minus 3 times the first row ($R_1$).

This operation is denoted as $R_2 \to R_2 - 3R_1$.
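
Row operations act in-place on the rows of an array. A minimal sketch of $R_2 \to R_2 - 3R_1$ (assuming NumPy; note the 0-based indexing, so $R_2$ is `M[1]`):

```python
import numpy as np

M = np.array([[1.0, 2.0], [3.0, 4.0]])
M[1] = M[1] - 3 * M[0]  # the operation R2 -> R2 - 3*R1
print(M)
# [[ 1.  2.]
#  [ 0. -2.]]
```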

Question 19. Define an invertible matrix. If $A = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}$ and $B = \begin{pmatrix} 1 & -1 \\ -1 & 2 \end{pmatrix}$, show that B is the inverse of A.

Answer:

Definition of an Invertible Matrix:

A square matrix $A$ of order $n \times n$ is said to be invertible if there exists another square matrix $B$ of the same order $n \times n$ such that $AB = BA = I$, where $I$ is the identity matrix of order $n \times n$.

If such a matrix $B$ exists, it is called the inverse of $A$ and is denoted by $A^{-1}$. The inverse of a matrix, if it exists, is unique.


Given:

Matrix $A = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}$ and matrix $B = \begin{pmatrix} 1 & -1 \\ -1 & 2 \end{pmatrix}$.

The identity matrix of order $2 \times 2$ is $I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$.


To Show:

$B$ is the inverse of $A$. This means we need to show that $AB = I$ and $BA = I$.


Solution:

First, calculate the product $AB$. Since both $A$ and $B$ are $2 \times 2$ matrices, their product will be a $2 \times 2$ matrix.

$AB = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} 1 & -1 \\ -1 & 2 \end{pmatrix}$

$AB = \begin{pmatrix} (2)(1) + (1)(-1) & (2)(-1) + (1)(2) \\ (1)(1) + (1)(-1) & (1)(-1) + (1)(2) \end{pmatrix}$

$AB = \begin{pmatrix} 2 - 1 & -2 + 2 \\ 1 - 1 & -1 + 2 \end{pmatrix}$

$AB = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$

This result is the identity matrix $I$. So, $AB = I$.


Next, calculate the product $BA$. Since both $B$ and $A$ are $2 \times 2$ matrices, their product will be a $2 \times 2$ matrix.

$BA = \begin{pmatrix} 1 & -1 \\ -1 & 2 \end{pmatrix} \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}$

$BA = \begin{pmatrix} (1)(2) + (-1)(1) & (1)(1) + (-1)(1) \\ (-1)(2) + (2)(1) & (-1)(1) + (2)(1) \end{pmatrix}$

$BA = \begin{pmatrix} 2 - 1 & 1 - 1 \\ -2 + 2 & -1 + 2 \end{pmatrix}$

$BA = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$

This result is also the identity matrix $I$. So, $BA = I$.


Since we have shown that $AB = I$ and $BA = I$, by the definition of an invertible matrix, matrix $B$ is the inverse of matrix $A$.

Thus, it is shown that $B$ is the inverse of $A$.
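
The definition suggests an immediate numerical test: multiply both ways and compare with $I$. A minimal sketch (assuming NumPy):

```python
import numpy as np

A = np.array([[2, 1], [1, 1]])
B = np.array([[1, -1], [-1, 2]])

print(np.array_equal(A @ B, np.eye(2)))  # True: AB = I
print(np.array_equal(B @ A, np.eye(2)))  # True: BA = I
```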

Question 20. For the matrix $A = \begin{pmatrix} 1 & 2 \\ 3 & -5 \end{pmatrix}$, show that $A^2 + 4A - 11I = O$, where O is the zero matrix.

Answer:

Given:

Matrix $A = \begin{pmatrix} 1 & 2 \\ 3 & -5 \end{pmatrix}$.

$I$ is the $2 \times 2$ identity matrix, $I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$.

$O$ is the $2 \times 2$ zero matrix, $O = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$.


To Show:

$A^2 + 4A - 11I = O$.


Solution:

We need to calculate the expression $A^2 + 4A - 11I$ and show that it equals the zero matrix $O$.


First, we calculate $A^2 = A \times A$:

$A^2 = \begin{pmatrix} 1 & 2 \\ 3 & -5 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & -5 \end{pmatrix}$

$A^2 = \begin{pmatrix} (1)(1)+(2)(3) & (1)(2)+(2)(-5) \\ (3)(1)+(-5)(3) & (3)(2)+(-5)(-5) \end{pmatrix}$

$A^2 = \begin{pmatrix} 1+6 & 2-10 \\ 3-15 & 6+25 \end{pmatrix}$

$A^2 = \begin{pmatrix} 7 & -8 \\ -12 & 31 \end{pmatrix}$


Next, we calculate $4A$:

$4A = 4 \times \begin{pmatrix} 1 & 2 \\ 3 & -5 \end{pmatrix}$

$4A = \begin{pmatrix} 4 \times 1 & 4 \times 2 \\ 4 \times 3 & 4 \times (-5) \end{pmatrix}$

$4A = \begin{pmatrix} 4 & 8 \\ 12 & -20 \end{pmatrix}$


Next, we calculate $11I$:

$11I = 11 \times \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$

$11I = \begin{pmatrix} 11 \times 1 & 11 \times 0 \\ 11 \times 0 & 11 \times 1 \end{pmatrix}$

$11I = \begin{pmatrix} 11 & 0 \\ 0 & 11 \end{pmatrix}$


Now, we calculate $A^2 + 4A - 11I$:

$A^2 + 4A - 11I = \begin{pmatrix} 7 & -8 \\ -12 & 31 \end{pmatrix} + \begin{pmatrix} 4 & 8 \\ 12 & -20 \end{pmatrix} - \begin{pmatrix} 11 & 0 \\ 0 & 11 \end{pmatrix}$


First, perform the addition $A^2 + 4A$:

$A^2 + 4A = \begin{pmatrix} 7+4 & -8+8 \\ -12+12 & 31+(-20) \end{pmatrix}$

$A^2 + 4A = \begin{pmatrix} 11 & 0 \\ 0 & 11 \end{pmatrix}$


Now, perform the subtraction $(A^2 + 4A) - 11I$:

$(A^2 + 4A) - 11I = \begin{pmatrix} 11 & 0 \\ 0 & 11 \end{pmatrix} - \begin{pmatrix} 11 & 0 \\ 0 & 11 \end{pmatrix}$

$(A^2 + 4A) - 11I = \begin{pmatrix} 11-11 & 0-0 \\ 0-0 & 11-11 \end{pmatrix}$

$(A^2 + 4A) - 11I = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$


The resulting matrix is the $2 \times 2$ zero matrix $O = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$.

Therefore, we have shown that $A^2 + 4A - 11I = O$.

The identity is shown to be true for the given matrix $A$.

Question 21. If $A = \begin{pmatrix} \alpha & \beta \\ \gamma & -\alpha \end{pmatrix}$ is such that $A^2 = I$, then find the relationship between $\alpha, \beta, \gamma$.

Answer:

Given:

Matrix $A = \begin{pmatrix} \alpha & \beta \\ \gamma & -\alpha \end{pmatrix}$ and $A^2 = I$, where $I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$.


To Find:

The relationship between $\alpha, \beta$, and $\gamma$.


Solution:

First, we calculate $A^2 = A \times A$:

$A^2 = \begin{pmatrix} \alpha & \beta \\ \gamma & -\alpha \end{pmatrix} \begin{pmatrix} \alpha & \beta \\ \gamma & -\alpha \end{pmatrix}$

$A^2 = \begin{pmatrix} (\alpha)(\alpha)+(\beta)(\gamma) & (\alpha)(\beta)+(\beta)(-\alpha) \\ (\gamma)(\alpha)+(-\alpha)(\gamma) & (\gamma)(\beta)+(-\alpha)(-\alpha) \end{pmatrix}$

$A^2 = \begin{pmatrix} \alpha^2 + \beta\gamma & \alpha\beta - \alpha\beta \\ \alpha\gamma - \alpha\gamma & \beta\gamma + \alpha^2 \end{pmatrix}$

$A^2 = \begin{pmatrix} \alpha^2 + \beta\gamma & 0 \\ 0 & \alpha^2 + \beta\gamma \end{pmatrix}$


We are given that $A^2 = I$. So, we equate the calculated $A^2$ with the identity matrix $I$:

$\begin{pmatrix} \alpha^2 + \beta\gamma & 0 \\ 0 & \alpha^2 + \beta\gamma \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$


Equating the corresponding elements, we get:

$\alpha^2 + \beta\gamma = 1$

$0 = 0$

$0 = 0$

$\alpha^2 + \beta\gamma = 1$


Both relevant equations are the same: $\alpha^2 + \beta\gamma = 1$.


Therefore, the relationship between $\alpha$, $\beta$, and $\gamma$ is:

$\alpha^2 + \beta\gamma = 1$
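
Because the entries are symbols rather than numbers, this is a natural place for symbolic computation. A minimal sketch (assuming SymPy is available):

```python
from sympy import Matrix, symbols

alpha, beta, gamma = symbols('alpha beta gamma')
A = Matrix([[alpha, beta], [gamma, -alpha]])

# The off-diagonal entries cancel automatically; both diagonal
# entries come out as alpha**2 + beta*gamma, so A**2 = I forces
# alpha**2 + beta*gamma = 1.
print(A**2)
# Matrix([[alpha**2 + beta*gamma, 0], [0, alpha**2 + beta*gamma]])
```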

Question 22. If $A = \begin{pmatrix} 2 & 3 \\ -1 & 2 \end{pmatrix}$, find $A^2 - 4A + 7I$.

Answer:

Given:

Matrix $A = \begin{pmatrix} 2 & 3 \\ -1 & 2 \end{pmatrix}$.

We are asked to find $A^2 - 4A + 7I$. Since $A$ is a $2 \times 2$ matrix, $I$ is the $2 \times 2$ identity matrix, $I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$.


To Find:

$A^2 - 4A + 7I$.


Solution:

First, we calculate $A^2 = A \times A$:

$A^2 = \begin{pmatrix} 2 & 3 \\ -1 & 2 \end{pmatrix} \begin{pmatrix} 2 & 3 \\ -1 & 2 \end{pmatrix}$

$A^2 = \begin{pmatrix} (2)(2)+(3)(-1) & (2)(3)+(3)(2) \\ (-1)(2)+(2)(-1) & (-1)(3)+(2)(2) \end{pmatrix}$

$A^2 = \begin{pmatrix} 4-3 & 6+6 \\ -2-2 & -3+4 \end{pmatrix}$

$A^2 = \begin{pmatrix} 1 & 12 \\ -4 & 1 \end{pmatrix}$


Next, we calculate $4A$:

$4A = 4 \times \begin{pmatrix} 2 & 3 \\ -1 & 2 \end{pmatrix}$

$4A = \begin{pmatrix} 4 \times 2 & 4 \times 3 \\ 4 \times (-1) & 4 \times 2 \end{pmatrix}$

$4A = \begin{pmatrix} 8 & 12 \\ -4 & 8 \end{pmatrix}$


Next, we calculate $7I$:

$7I = 7 \times \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$

$7I = \begin{pmatrix} 7 \times 1 & 7 \times 0 \\ 7 \times 0 & 7 \times 1 \end{pmatrix}$

$7I = \begin{pmatrix} 7 & 0 \\ 0 & 7 \end{pmatrix}$


Now, we calculate $A^2 - 4A + 7I$:

$A^2 - 4A + 7I = \begin{pmatrix} 1 & 12 \\ -4 & 1 \end{pmatrix} - \begin{pmatrix} 8 & 12 \\ -4 & 8 \end{pmatrix} + \begin{pmatrix} 7 & 0 \\ 0 & 7 \end{pmatrix}$


Perform the subtraction $A^2 - 4A$:

$A^2 - 4A = \begin{pmatrix} 1-8 & 12-12 \\ -4-(-4) & 1-8 \end{pmatrix}$

$A^2 - 4A = \begin{pmatrix} -7 & 0 \\ 0 & -7 \end{pmatrix}$


Now, perform the addition $(A^2 - 4A) + 7I$:

$(A^2 - 4A) + 7I = \begin{pmatrix} -7 & 0 \\ 0 & -7 \end{pmatrix} + \begin{pmatrix} 7 & 0 \\ 0 & 7 \end{pmatrix}$

$(A^2 - 4A) + 7I = \begin{pmatrix} -7+7 & 0+0 \\ 0+0 & -7+7 \end{pmatrix}$

$(A^2 - 4A) + 7I = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$


The result of the expression $A^2 - 4A + 7I$ is the zero matrix:

$A^2 - 4A + 7I = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$



Long Answer Type Questions

Question 1. If $A = \begin{pmatrix} 1 & -1 & 2 \\ 0 & 3 & 4 \end{pmatrix}$ and $B = \begin{pmatrix} -1 & 0 \\ 2 & 1 \\ 3 & -2 \end{pmatrix}$, verify that $(AB)' = B'A'$.

Answer:

Given:

$A = \begin{pmatrix} 1 & -1 & 2 \\ 0 & 3 & 4 \end{pmatrix}$ and $B = \begin{pmatrix} -1 & 0 \\ 2 & 1 \\ 3 & -2 \end{pmatrix}$.


To Verify:

$(AB)' = B'A'$


Solution:

We will evaluate both sides of the equation separately.


Left Hand Side (LHS): $(AB)'$

First, calculate the product $AB$. Matrix $A$ has order $2 \times 3$ and matrix $B$ has order $3 \times 2$. The number of columns in $A$ (3) is equal to the number of rows in $B$ (3), so the product $AB$ is defined. The resulting matrix will be of order $2 \times 2$.

$AB = \begin{pmatrix} 1 & -1 & 2 \\ 0 & 3 & 4 \end{pmatrix} \begin{pmatrix} -1 & 0 \\ 2 & 1 \\ 3 & -2 \end{pmatrix}$

$AB = \begin{pmatrix} (1)(-1)+(-1)(2)+(2)(3) & (1)(0)+(-1)(1)+(2)(-2) \\ (0)(-1)+(3)(2)+(4)(3) & (0)(0)+(3)(1)+(4)(-2) \end{pmatrix}$

$AB = \begin{pmatrix} -1-2+6 & 0-1-4 \\ 0+6+12 & 0+3-8 \end{pmatrix}$

$AB = \begin{pmatrix} 3 & -5 \\ 18 & -5 \end{pmatrix}$


Now, find the transpose of $AB$, $(AB)'$. To find the transpose, interchange the rows and columns of $AB$.

$(AB)' = \begin{pmatrix} 3 & -5 \\ 18 & -5 \end{pmatrix}' = \begin{pmatrix} 3 & 18 \\ -5 & -5 \end{pmatrix}$

This is the value of the LHS.


Right Hand Side (RHS): $B'A'$

First, find the transpose of matrix $A$, $A'$. Matrix $A$ is $2 \times 3$, so $A'$ will be $3 \times 2$.

$A' = \begin{pmatrix} 1 & -1 & 2 \\ 0 & 3 & 4 \end{pmatrix}' = \begin{pmatrix} 1 & 0 \\ -1 & 3 \\ 2 & 4 \end{pmatrix}$


Next, find the transpose of matrix $B$, $B'$. Matrix $B$ is $3 \times 2$, so $B'$ will be $2 \times 3$.

$B' = \begin{pmatrix} -1 & 0 \\ 2 & 1 \\ 3 & -2 \end{pmatrix}' = \begin{pmatrix} -1 & 2 & 3 \\ 0 & 1 & -2 \end{pmatrix}$


Now, calculate the product $B'A'$. Matrix $B'$ is $2 \times 3$ and matrix $A'$ is $3 \times 2$. The number of columns in $B'$ (3) is equal to the number of rows in $A'$ (3), so the product $B'A'$ is defined. The resulting matrix will be of order $2 \times 2$.

$B'A' = \begin{pmatrix} -1 & 2 & 3 \\ 0 & 1 & -2 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ -1 & 3 \\ 2 & 4 \end{pmatrix}$

$B'A' = \begin{pmatrix} (-1)(1)+(2)(-1)+(3)(2) & (-1)(0)+(2)(3)+(3)(4) \\ (0)(1)+(1)(-1)+(-2)(2) & (0)(0)+(1)(3)+(-2)(4) \end{pmatrix}$

$B'A' = \begin{pmatrix} -1-2+6 & 0+6+12 \\ 0-1-4 & 0+3-8 \end{pmatrix}$

$B'A' = \begin{pmatrix} 3 & 18 \\ -5 & -5 \end{pmatrix}$

This is the value of the RHS.


Verification:

Comparing the LHS and RHS:

LHS = $(AB)' = \begin{pmatrix} 3 & 18 \\ -5 & -5 \end{pmatrix}$

RHS = $B'A' = \begin{pmatrix} 3 & 18 \\ -5 & -5 \end{pmatrix}$


Since LHS = RHS, the property $(AB)' = B'A'$ is verified for the given matrices $A$ and $B$.
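
The reversal rule holds for rectangular matrices too, as this question illustrates. A minimal NumPy sketch of the same verification:

```python
import numpy as np

A = np.array([[1, -1, 2], [0, 3, 4]])     # order 2 x 3
B = np.array([[-1, 0], [2, 1], [3, -2]])  # order 3 x 2

print((A @ B).T)                             # [[ 3 18] [-5 -5]]
print(np.array_equal((A @ B).T, B.T @ A.T))  # True: (AB)' = B'A'
```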

Question 2. Express the matrix $A = \begin{pmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{pmatrix}$ as the sum of a symmetric and a skew-symmetric matrix.

Answer:

Any square matrix $A$ can be uniquely expressed as the sum of a symmetric matrix and a skew-symmetric matrix.

The decomposition is given by $A = S + K$, where $S$ is a symmetric matrix and $K$ is a skew-symmetric matrix.

The symmetric part $S$ is given by $S = \frac{1}{2}(A+A')$.

The skew-symmetric part $K$ is given by $K = \frac{1}{2}(A-A')$.


Given:

Matrix $A = \begin{pmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{pmatrix}$.


To Express:

$A$ as the sum of a symmetric and a skew-symmetric matrix.


Solution:

First, find the transpose of $A$, $A'$:

$A' = \begin{pmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{pmatrix}' = \begin{pmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{pmatrix}$


Observe that $A' = A$. This means the given matrix $A$ is already a symmetric matrix.


Now, calculate the symmetric part $S = \frac{1}{2}(A+A')$:

$A+A' = \begin{pmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{pmatrix} + \begin{pmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{pmatrix} = \begin{pmatrix} 6+6 & -2-2 & 2+2 \\ -2-2 & 3+3 & -1-1 \\ 2+2 & -1-1 & 3+3 \end{pmatrix} = \begin{pmatrix} 12 & -4 & 4 \\ -4 & 6 & -2 \\ 4 & -2 & 6 \end{pmatrix}$

$S = \frac{1}{2}(A+A') = \frac{1}{2} \begin{pmatrix} 12 & -4 & 4 \\ -4 & 6 & -2 \\ 4 & -2 & 6 \end{pmatrix} = \begin{pmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{pmatrix}$

We can verify that $S$ is symmetric by checking $S' = S$. Indeed, $S' = \begin{pmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{pmatrix}' = \begin{pmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{pmatrix} = S$.


Next, calculate the skew-symmetric part $K = \frac{1}{2}(A-A')$:

$A-A' = \begin{pmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{pmatrix} - \begin{pmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{pmatrix} = \begin{pmatrix} 6-6 & -2-(-2) & 2-2 \\ -2-(-2) & 3-3 & -1-(-1) \\ 2-2 & -1-(-1) & 3-3 \end{pmatrix} = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$

$K = \frac{1}{2}(A-A') = \frac{1}{2} \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$

We can verify that $K$ is skew-symmetric by checking $K' = -K$. Indeed, $K' = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}' = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$ and $-K = - \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$. So $K' = -K$.


Finally, express $A$ as the sum of $S$ and $K$:

$A = S + K = \begin{pmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{pmatrix} + \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$


The matrix $A$ expressed as the sum of a symmetric and a skew-symmetric matrix is:

$A = \underbrace{\begin{pmatrix} 6 & -2 & 2 \\ -2 & 3 & -1 \\ 2 & -1 & 3 \end{pmatrix}}_{\text{Symmetric}} + \underbrace{\begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}}_{\text{Skew-symmetric}}$

Question 3. If $A = \begin{pmatrix} 1 & 0 & 2 \\ 0 & 2 & 1 \\ 2 & 0 & 3 \end{pmatrix}$, prove that $A^3 - 6A^2 + 7A + 2I = O$, where I is the identity matrix and O is the zero matrix.

Answer:

Given:

Matrix $A = \begin{pmatrix} 1 & 0 & 2 \\ 0 & 2 & 1 \\ 2 & 0 & 3 \end{pmatrix}$.

$I$ is the $3 \times 3$ identity matrix, $I = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$.

$O$ is the $3 \times 3$ zero matrix, $O = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$.


To Prove:

$A^3 - 6A^2 + 7A + 2I = O$.


Proof:

We will evaluate the expression $A^3 - 6A^2 + 7A + 2I$.


First, calculate $A^2 = A \times A$:

$A^2 = \begin{pmatrix} 1 & 0 & 2 \\ 0 & 2 & 1 \\ 2 & 0 & 3 \end{pmatrix} \begin{pmatrix} 1 & 0 & 2 \\ 0 & 2 & 1 \\ 2 & 0 & 3 \end{pmatrix}$

$A^2 = \begin{pmatrix} (1)(1)+(0)(0)+(2)(2) & (1)(0)+(0)(2)+(2)(0) & (1)(2)+(0)(1)+(2)(3) \\ (0)(1)+(2)(0)+(1)(2) & (0)(0)+(2)(2)+(1)(0) & (0)(2)+(2)(1)+(1)(3) \\ (2)(1)+(0)(0)+(3)(2) & (2)(0)+(0)(2)+(3)(0) & (2)(2)+(0)(1)+(3)(3) \end{pmatrix}$

$A^2 = \begin{pmatrix} 1+0+4 & 0+0+0 & 2+0+6 \\ 0+0+2 & 0+4+0 & 0+2+3 \\ 2+0+6 & 0+0+0 & 4+0+9 \end{pmatrix}$

$A^2 = \begin{pmatrix} 5 & 0 & 8 \\ 2 & 4 & 5 \\ 8 & 0 & 13 \end{pmatrix}$


Next, calculate $A^3 = A^2 \times A$:

$A^3 = \begin{pmatrix} 5 & 0 & 8 \\ 2 & 4 & 5 \\ 8 & 0 & 13 \end{pmatrix} \begin{pmatrix} 1 & 0 & 2 \\ 0 & 2 & 1 \\ 2 & 0 & 3 \end{pmatrix}$

$A^3 = \begin{pmatrix} (5)(1)+(0)(0)+(8)(2) & (5)(0)+(0)(2)+(8)(0) & (5)(2)+(0)(1)+(8)(3) \\ (2)(1)+(4)(0)+(5)(2) & (2)(0)+(4)(2)+(5)(0) & (2)(2)+(4)(1)+(5)(3) \\ (8)(1)+(0)(0)+(13)(2) & (8)(0)+(0)(2)+(13)(0) & (8)(2)+(0)(1)+(13)(3) \end{pmatrix}$

$A^3 = \begin{pmatrix} 5+0+16 & 0+0+0 & 10+0+24 \\ 2+0+10 & 0+8+0 & 4+4+15 \\ 8+0+26 & 0+0+0 & 16+0+39 \end{pmatrix}$

$A^3 = \begin{pmatrix} 21 & 0 & 34 \\ 12 & 8 & 23 \\ 34 & 0 & 55 \end{pmatrix}$


Now, calculate the scalar multiples:

$6A^2 = 6 \begin{pmatrix} 5 & 0 & 8 \\ 2 & 4 & 5 \\ 8 & 0 & 13 \end{pmatrix} = \begin{pmatrix} 30 & 0 & 48 \\ 12 & 24 & 30 \\ 48 & 0 & 78 \end{pmatrix}$


$7A = 7 \begin{pmatrix} 1 & 0 & 2 \\ 0 & 2 & 1 \\ 2 & 0 & 3 \end{pmatrix} = \begin{pmatrix} 7 & 0 & 14 \\ 0 & 14 & 7 \\ 14 & 0 & 21 \end{pmatrix}$


$2I = 2 \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix} = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \end{pmatrix}$


Finally, calculate $A^3 - 6A^2 + 7A + 2I$:

$A^3 - 6A^2 + 7A + 2I = \begin{pmatrix} 21 & 0 & 34 \\ 12 & 8 & 23 \\ 34 & 0 & 55 \end{pmatrix} - \begin{pmatrix} 30 & 0 & 48 \\ 12 & 24 & 30 \\ 48 & 0 & 78 \end{pmatrix} + \begin{pmatrix} 7 & 0 & 14 \\ 0 & 14 & 7 \\ 14 & 0 & 21 \end{pmatrix} + \begin{pmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \end{pmatrix}$

$A^3 - 6A^2 + 7A + 2I = \begin{pmatrix} (21-30+7+2) & (0-0+0+0) & (34-48+14+0) \\ (12-12+0+0) & (8-24+14+2) & (23-30+7+0) \\ (34-48+14+0) & (0-0+0+0) & (55-78+21+2) \end{pmatrix}$

$A^3 - 6A^2 + 7A + 2I = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$


The result is the $3 \times 3$ zero matrix, $O$.

Therefore, $A^3 - 6A^2 + 7A + 2I = O$.

Hence, the identity is proved.
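
With three matrix powers involved, a numerical double-check is worthwhile. A minimal sketch (assuming NumPy):

```python
import numpy as np
from numpy.linalg import matrix_power as mpow

A = np.array([[1, 0, 2], [0, 2, 1], [2, 0, 3]])
I = np.eye(3, dtype=int)

# A^3 - 6A^2 + 7A + 2I should be the 3x3 zero matrix
print(mpow(A, 3) - 6 * mpow(A, 2) + 7 * A + 2 * I)
# [[0 0 0]
#  [0 0 0]
#  [0 0 0]]
```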

Question 4. Using elementary row transformations, find the inverse of the matrix $A = \begin{pmatrix} 1 & 2 \\ 2 & -1 \end{pmatrix}$.

Answer:

Given:

Matrix $A = \begin{pmatrix} 1 & 2 \\ 2 & -1 \end{pmatrix}$.


To Find:

The inverse of matrix $A$, denoted by $A^{-1}$, using elementary row transformations.


Solution:

We write the augmented matrix $[A | I]$, where $I$ is the identity matrix of the same order as $A$ ($2 \times 2$).

$[A | I] = \begin{pmatrix} 1 & 2 & | & 1 & 0 \\ 2 & -1 & | & 0 & 1 \end{pmatrix}$


Our goal is to transform the left side of this augmented matrix into the identity matrix by applying elementary row operations to the entire matrix. The matrix on the right side will then become the inverse of $A$.


Step 1: Make the element in the second row, first column zero. Apply the operation $R_2 \to R_2 - 2R_1$.

The new Row 2 is calculated as: $(2, -1) - 2 \times (1, 2) = (2 - 2\times1, -1 - 2\times2) = (2 - 2, -1 - 4) = (0, -5)$.

Applying the same operation to the right side of the augmented matrix:

The new elements in the second row on the right are: $(0, 1) - 2 \times (1, 0) = (0 - 2\times1, 1 - 2\times0) = (0 - 2, 1 - 0) = (-2, 1)$.

The augmented matrix becomes:

$\begin{pmatrix} 1 & 2 & | & 1 & 0 \\ 0 & -5 & | & -2 & 1 \end{pmatrix}$


Step 2: Make the diagonal element in the second row equal to 1. Apply the operation $R_2 \to -\frac{1}{5}R_2$.

The new Row 2 is calculated as: $-\frac{1}{5} \times (0, -5) = (-\frac{1}{5} \times 0, -\frac{1}{5} \times -5) = (0, 1)$.

Applying the same operation to the right side of the augmented matrix:

The new elements in the second row on the right are: $-\frac{1}{5} \times (-2, 1) = (-\frac{1}{5} \times -2, -\frac{1}{5} \times 1) = (\frac{2}{5}, -\frac{1}{5})$.

The augmented matrix becomes:

$\begin{pmatrix} 1 & 2 & | & 1 & 0 \\ 0 & 1 & | & \frac{2}{5} & -\frac{1}{5} \end{pmatrix}$


Step 3: Make the element in the first row, second column zero. Apply the operation $R_1 \to R_1 - 2R_2$.

The new Row 1 is calculated as: $(1, 2) - 2 \times (0, 1) = (1 - 2\times0, 2 - 2\times1) = (1 - 0, 2 - 2) = (1, 0)$.

Applying the same operation to the right side of the augmented matrix:

The new elements in the first row on the right are: $(1, 0) - 2 \times (\frac{2}{5}, -\frac{1}{5}) = (1 - 2\times\frac{2}{5}, 0 - 2\times(-\frac{1}{5})) = (1 - \frac{4}{5}, 0 + \frac{2}{5}) = (\frac{5-4}{5}, \frac{2}{5}) = (\frac{1}{5}, \frac{2}{5})$.

The augmented matrix becomes:

$\begin{pmatrix} 1 & 0 & | & \frac{1}{5} & \frac{2}{5} \\ 0 & 1 & | & \frac{2}{5} & -\frac{1}{5} \end{pmatrix}$


The left side of the augmented matrix is now the identity matrix $I$. The matrix on the right side is the inverse of $A$.

Therefore, the inverse of matrix $A$ is:

$A^{-1} = \begin{pmatrix} \frac{1}{5} & \frac{2}{5} \\ \frac{2}{5} & -\frac{1}{5} \end{pmatrix}$
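
The same Gauss–Jordan procedure on the augmented matrix $[A \,|\, I]$ can be coded directly. Below is a minimal sketch (assuming NumPy; it does no pivot swapping, so it assumes every pivot encountered is non-zero, which holds for this matrix):

```python
import numpy as np

def inverse_by_row_ops(A):
    """Gauss-Jordan elimination on the augmented matrix [A | I]."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        M[col] = M[col] / M[col, col]  # scale pivot row to make pivot 1
        for row in range(n):
            if row != col:
                M[row] = M[row] - M[row, col] * M[col]  # clear other entries
    return M[:, n:]  # right block is now A^{-1}

A = np.array([[1, 2], [2, -1]])
print(inverse_by_row_ops(A))
# [[ 0.2  0.4]
#  [ 0.4 -0.2]]   i.e. [[1/5, 2/5], [2/5, -1/5]]
```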

Question 5. If $A = \begin{pmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{pmatrix}$, prove that $A^n = \begin{pmatrix} \cos n\theta & \sin n\theta \\ -\sin n\theta & \cos n\theta \end{pmatrix}$ for all positive integers $n$, using the Principle of Mathematical Induction.

Answer:

Given:

Matrix $A = \begin{pmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{pmatrix}$.


To Prove:

$A^n = \begin{pmatrix} \cos n\theta & \sin n\theta \\ -\sin n\theta & \cos n\theta \end{pmatrix}$ for all positive integers $n$, using the Principle of Mathematical Induction.


Proof:

Let the given statement be $P(n): A^n = \begin{pmatrix} \cos n\theta & \sin n\theta \\ -\sin n\theta & \cos n\theta \end{pmatrix}$.


Base Case (P(1)):

We need to show that $P(1)$ is true. For $n=1$, the statement becomes:

$A^1 = \begin{pmatrix} \cos (1)\theta & \sin (1)\theta \\ -\sin (1)\theta & \cos (1)\theta \end{pmatrix} = \begin{pmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{pmatrix}$

This is the given matrix $A$. Thus, $A^1 = A$, and $P(1)$ is true.


Inductive Hypothesis (Assume P(k)):

Assume that the statement $P(k)$ is true for some arbitrary positive integer $k$. That is,

$A^k = \begin{pmatrix} \cos k\theta & \sin k\theta \\ -\sin k\theta & \cos k\theta \end{pmatrix}$


Inductive Step (Prove P(k+1)):

We need to show that $P(k+1)$ is true, i.e., $A^{k+1} = \begin{pmatrix} \cos (k+1)\theta & \sin (k+1)\theta \\ -\sin (k+1)\theta & \cos (k+1)\theta \end{pmatrix}$.

We know that $A^{k+1} = A^k \times A$.

Using the inductive hypothesis for $A^k$ and the given matrix $A$, we have:

$A^{k+1} = \begin{pmatrix} \cos k\theta & \sin k\theta \\ -\sin k\theta & \cos k\theta \end{pmatrix} \begin{pmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{pmatrix}$


Now, we perform the matrix multiplication:

The element in the first row, first column is:

$(\cos k\theta)(\cos \theta) + (\sin k\theta)(-\sin \theta) = \cos k\theta \cos \theta - \sin k\theta \sin \theta$

Using the trigonometric identity $\cos(A+B) = \cos A \cos B - \sin A \sin B$, with $A=k\theta$ and $B=\theta$, this simplifies to $\cos(k\theta + \theta) = \cos((k+1)\theta)$.


The element in the first row, second column is:

$(\cos k\theta)(\sin \theta) + (\sin k\theta)(\cos \theta) = \cos k\theta \sin \theta + \sin k\theta \cos \theta$

Using the trigonometric identity $\sin(A+B) = \sin A \cos B + \cos A \sin B$, with $A=k\theta$ and $B=\theta$, this simplifies to $\sin(k\theta + \theta) = \sin((k+1)\theta)$.


The element in the second row, first column is:

$(-\sin k\theta)(\cos \theta) + (\cos k\theta)(-\sin \theta) = -\sin k\theta \cos \theta - \cos k\theta \sin \theta$

$= -(\sin k\theta \cos \theta + \cos k\theta \sin \theta)$

Using the trigonometric identity $\sin(A+B) = \sin A \cos B + \cos A \sin B$, with $A=k\theta$ and $B=\theta$, this simplifies to $-\sin(k\theta + \theta) = -\sin((k+1)\theta)$.


The element in the second row, second column is:

$(-\sin k\theta)(\sin \theta) + (\cos k\theta)(\cos \theta) = -\sin k\theta \sin \theta + \cos k\theta \cos \theta$

$= \cos k\theta \cos \theta - \sin k\theta \sin \theta$

Using the trigonometric identity $\cos(A+B) = \cos A \cos B - \sin A \sin B$, with $A=k\theta$ and $B=\theta$, this simplifies to $\cos(k\theta + \theta) = \cos((k+1)\theta)$.


So, the product matrix $A^{k+1}$ is:

$A^{k+1} = \begin{pmatrix} \cos((k+1)\theta) & \sin((k+1)\theta) \\ -\sin((k+1)\theta) & \cos((k+1)\theta) \end{pmatrix}$


This is the statement $P(k+1)$.

Since we have shown that $P(1)$ is true and that if $P(k)$ is true, then $P(k+1)$ is also true, by the Principle of Mathematical Induction, the statement $P(n)$ is true for all positive integers $n$.


Thus, it is proved that $A^n = \begin{pmatrix} \cos n\theta & \sin n\theta \\ -\sin n\theta & \cos n\theta \end{pmatrix}$ for all positive integers $n$.
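
The induction result can be spot-checked numerically: raising the matrix to the $n$-th power should match the matrix built directly from $n\theta$. A minimal sketch (assuming NumPy; the angle and exponent are arbitrary choices):

```python
import numpy as np
from numpy.linalg import matrix_power

def R(t):
    # The matrix of the question, with angle t
    return np.array([[np.cos(t),  np.sin(t)],
                     [-np.sin(t), np.cos(t)]])

theta, n = 0.3, 7  # arbitrary test values
print(np.allclose(matrix_power(R(theta), n), R(n * theta)))  # True
```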

Question 6. If $A = \begin{pmatrix} 2 & -3 \\ 3 & 4 \end{pmatrix}$, show that $A^2 - 6A + 17I = O$. Use this result to find $A^{-1}$.

Answer:

Given:

Matrix $A = \begin{pmatrix} 2 & -3 \\ 3 & 4 \end{pmatrix}$.

$I$ is the $2 \times 2$ identity matrix, $I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$.

$O$ is the $2 \times 2$ zero matrix, $O = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$.


Part 1: Show that $A^2 - 6A + 17I = O$.

First, we calculate $A^2 = A \times A$:

$A^2 = \begin{pmatrix} 2 & -3 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 2 & -3 \\ 3 & 4 \end{pmatrix}$

$A^2 = \begin{pmatrix} (2)(2)+(-3)(3) & (2)(-3)+(-3)(4) \\ (3)(2)+(4)(3) & (3)(-3)+(4)(4) \end{pmatrix}$

$A^2 = \begin{pmatrix} 4-9 & -6-12 \\ 6+12 & -9+16 \end{pmatrix}$

$A^2 = \begin{pmatrix} -5 & -18 \\ 18 & 7 \end{pmatrix}$


Next, we calculate $6A$:

$6A = 6 \times \begin{pmatrix} 2 & -3 \\ 3 & 4 \end{pmatrix}$

$6A = \begin{pmatrix} 6 \times 2 & 6 \times (-3) \\ 6 \times 3 & 6 \times 4 \end{pmatrix}$

$6A = \begin{pmatrix} 12 & -18 \\ 18 & 24 \end{pmatrix}$


Next, we calculate $17I$:

$17I = 17 \times \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$

$17I = \begin{pmatrix} 17 \times 1 & 17 \times 0 \\ 17 \times 0 & 17 \times 1 \end{pmatrix}$

$17I = \begin{pmatrix} 17 & 0 \\ 0 & 17 \end{pmatrix}$


Now, we calculate $A^2 - 6A + 17I$:

$A^2 - 6A + 17I = \begin{pmatrix} -5 & -18 \\ 18 & 7 \end{pmatrix} - \begin{pmatrix} 12 & -18 \\ 18 & 24 \end{pmatrix} + \begin{pmatrix} 17 & 0 \\ 0 & 17 \end{pmatrix}$


Perform the subtraction $A^2 - 6A$:

$A^2 - 6A = \begin{pmatrix} -5-12 & -18-(-18) \\ 18-18 & 7-24 \end{pmatrix}$

$A^2 - 6A = \begin{pmatrix} -17 & 0 \\ 0 & -17 \end{pmatrix}$


Now, perform the addition $(A^2 - 6A) + 17I$:

$(A^2 - 6A) + 17I = \begin{pmatrix} -17 & 0 \\ 0 & -17 \end{pmatrix} + \begin{pmatrix} 17 & 0 \\ 0 & 17 \end{pmatrix}$

$(A^2 - 6A) + 17I = \begin{pmatrix} -17+17 & 0+0 \\ 0+0 & -17+17 \end{pmatrix}$

$(A^2 - 6A) + 17I = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$


The result is the zero matrix $O$. Thus, $A^2 - 6A + 17I = O$ is shown.


Part 2: Use the result $A^2 - 6A + 17I = O$ to find $A^{-1}$.

We have the equation $A^2 - 6A + 17I = O$.

To find the inverse $A^{-1}$, we can pre-multiply or post-multiply the equation by $A^{-1}$. Note that $\det(A) = (2)(4) - (-3)(3) = 8 + 9 = 17 \neq 0$, so $A$ is invertible and $A^{-1}$ exists. Pre-multiplying the equation by $A^{-1}$:

$A^{-1}(A^2 - 6A + 17I) = A^{-1}O$

Using the distributive property of matrix multiplication:

$A^{-1}A^2 - A^{-1}6A + A^{-1}17I = O$

Using the properties $A^{-1}A = I$, $IA = A$, $AI = A$, $A^2 = AA$, and $A^{-1}O = O$:

$A^{-1}(AA) - 6(A^{-1}A) + 17(A^{-1}I) = O$

$(A^{-1}A)A - 6I + 17A^{-1} = O$

$IA - 6I + 17A^{-1} = O$

$A - 6I + 17A^{-1} = O$


Now, we can solve for $A^{-1}$:

$17A^{-1} = 6I - A$

$A^{-1} = \frac{1}{17}(6I - A)$


Substitute the matrices for $I$ and $A$:

$A^{-1} = \frac{1}{17} \left( 6 \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} - \begin{pmatrix} 2 & -3 \\ 3 & 4 \end{pmatrix} \right)$

$A^{-1} = \frac{1}{17} \left( \begin{pmatrix} 6 & 0 \\ 0 & 6 \end{pmatrix} - \begin{pmatrix} 2 & -3 \\ 3 & 4 \end{pmatrix} \right)$

$A^{-1} = \frac{1}{17} \begin{pmatrix} 6-2 & 0-(-3) \\ 0-3 & 6-4 \end{pmatrix}$

$A^{-1} = \frac{1}{17} \begin{pmatrix} 4 & 3 \\ -3 & 2 \end{pmatrix}$


Distribute the scalar $\frac{1}{17}$:

$A^{-1} = \begin{pmatrix} \frac{4}{17} & \frac{3}{17} \\ -\frac{3}{17} & \frac{2}{17} \end{pmatrix}$


The inverse of matrix $A$ is $\begin{pmatrix} \frac{4}{17} & \frac{3}{17} \\ -\frac{3}{17} & \frac{2}{17} \end{pmatrix}$.
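
As an optional numerical check (a minimal NumPy sketch, not part of the written solution), both the identity $A^2 - 6A + 17I = O$ and the derived inverse can be verified directly:

```python
import numpy as np

A = np.array([[2, -3],
              [3,  4]])
I = np.eye(2)

# Check the matrix identity A^2 - 6A + 17I = O
print(A @ A - 6*A + 17*I)            # expected: the 2x2 zero matrix

# Inverse derived from the identity: A^{-1} = (1/17)(6I - A)
A_inv = (6*I - A) / 17
print(np.allclose(A @ A_inv, I))     # expected: True
```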

Question 7. Using elementary column transformations, find the inverse of the matrix $A = \begin{pmatrix} 1 & 3 \\ 2 & 7 \end{pmatrix}$.

Answer:

Given:

Matrix $A = \begin{pmatrix} 1 & 3 \\ 2 & 7 \end{pmatrix}$.


To Find:

The inverse of matrix $A$, denoted by $A^{-1}$, using elementary column transformations.


Solution:

To find the inverse using elementary column transformations, we start with the matrix equation $A = AI$.

We apply column operations to the matrix $A$ on the left side to transform it into the identity matrix $I$. Since every elementary column operation corresponds to post-multiplication by an elementary matrix, the same column operations are applied simultaneously to the identity matrix that post-multiplies $A$ on the right side. When the left side becomes $I$, the matrix post-multiplying $A$ on the right will be $A^{-1}$.

The equation is: $\begin{pmatrix} 1 & 3 \\ 2 & 7 \end{pmatrix} = A \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$


Let $C_1$ and $C_2$ denote the first and second columns, respectively.

Step 1: Make the element in the first row, second column equal to zero. Apply the operation $C_2 \to C_2 - 3C_1$ to both matrices.

Applying $C_2 \to C_2 - 3C_1$ to the left matrix:

$\begin{pmatrix} 1 & 3 - 3(1) \\ 2 & 7 - 3(2) \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix}$

Applying $C_2 \to C_2 - 3C_1$ to the identity matrix on the right:

$\begin{pmatrix} 1 & 0 - 3(1) \\ 0 & 1 - 3(0) \end{pmatrix} = \begin{pmatrix} 1 & -3 \\ 0 & 1 \end{pmatrix}$

The matrix equation becomes: $\begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix} = A \begin{pmatrix} 1 & -3 \\ 0 & 1 \end{pmatrix}$


Step 2: Make the element in the second row, first column equal to zero. Apply the operation $C_1 \to C_1 - 2C_2$ to both matrices.

Applying $C_1 \to C_1 - 2C_2$ to the left matrix:

$\begin{pmatrix} 1 - 2(0) & 0 \\ 2 - 2(1) & 1 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$

Applying $C_1 \to C_1 - 2C_2$ to the right matrix:

$\begin{pmatrix} 1 - 2(-3) & -3 \\ 0 - 2(1) & 1 \end{pmatrix} = \begin{pmatrix} 1 + 6 & -3 \\ 0 - 2 & 1 \end{pmatrix} = \begin{pmatrix} 7 & -3 \\ -2 & 1 \end{pmatrix}$

The matrix equation becomes: $\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = A \begin{pmatrix} 7 & -3 \\ -2 & 1 \end{pmatrix}$


The left side of the equation is now the identity matrix $I$. The matrix post-multiplying $A$ on the right side is therefore the inverse of $A$.

Therefore, the inverse of matrix $A$ is:

$A^{-1} = \begin{pmatrix} 7 & -3 \\ -2 & 1 \end{pmatrix}$
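
Because each column operation corresponds to post-multiplication by an elementary matrix, the product of those elementary matrices should itself equal $A^{-1}$. The following minimal NumPy sketch (an optional check, not part of the written solution) confirms this for the two operations used above:

```python
import numpy as np

A = np.array([[1, 3],
              [2, 7]], dtype=float)

# Elementary matrices for the two column operations used above:
# C2 -> C2 - 3*C1 and C1 -> C1 - 2*C2 (column ops post-multiply)
E1 = np.array([[1, -3],
               [0,  1]], dtype=float)
E2 = np.array([[ 1, 0],
               [-2, 1]], dtype=float)

B = E1 @ E2                            # accumulated product, should equal A^{-1}
print(B)                               # expected: [[7, -3], [-2, 1]]
print(np.allclose(A @ B, np.eye(2)))   # expected: True
```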

Question 8. Solve the system of linear equations using matrix multiplication: $2x - y = 5$ and $3x + 2y = 4$.

Answer:

Given:

The system of linear equations:

$2x - y = 5$

$3x + 2y = 4$


To Solve:

Find the values of $x$ and $y$ using matrix multiplication (matrix inverse method).


Solution:

We can write the given system of linear equations in matrix form as $AX = B$, where:

$A = \begin{pmatrix} 2 & -1 \\ 3 & 2 \end{pmatrix}$ (Coefficient matrix)

$X = \begin{pmatrix} x \\ y \end{pmatrix}$ (Variable matrix)

$B = \begin{pmatrix} 5 \\ 4 \end{pmatrix}$ (Constant matrix)


The matrix equation is $\begin{pmatrix} 2 & -1 \\ 3 & 2 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 5 \\ 4 \end{pmatrix}$.

To solve for $X$, we find the inverse of matrix $A$, $A^{-1}$, and then compute $X = A^{-1}B$.


First, find the determinant of matrix $A$ to check if the inverse exists.

$\text{det}(A) = (2)(2) - (-1)(3) = 4 - (-3) = 4 + 3 = 7$

Since $\text{det}(A) = 7 \neq 0$, matrix $A$ is non-singular and its inverse exists.


Next, find the adjoint of matrix $A$. For a $2 \times 2$ matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$, the adjoint is $\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$.

$\text{adj}(A) = \begin{pmatrix} 2 & -(-1) \\ -3 & 2 \end{pmatrix} = \begin{pmatrix} 2 & 1 \\ -3 & 2 \end{pmatrix}$


Now, find the inverse of matrix $A$ using the formula $A^{-1} = \frac{1}{\text{det}(A)}\text{adj}(A)$.

$A^{-1} = \frac{1}{7} \begin{pmatrix} 2 & 1 \\ -3 & 2 \end{pmatrix}$

$A^{-1} = \begin{pmatrix} \frac{2}{7} & \frac{1}{7} \\ -\frac{3}{7} & \frac{2}{7} \end{pmatrix}$


Finally, find the matrix $X$ by multiplying $A^{-1}$ by $B$:

$X = A^{-1}B = \begin{pmatrix} \frac{2}{7} & \frac{1}{7} \\ -\frac{3}{7} & \frac{2}{7} \end{pmatrix} \begin{pmatrix} 5 \\ 4 \end{pmatrix}$

$X = \begin{pmatrix} (\frac{2}{7})(5) + (\frac{1}{7})(4) \\ (-\frac{3}{7})(5) + (\frac{2}{7})(4) \end{pmatrix}$

$X = \begin{pmatrix} \frac{10}{7} + \frac{4}{7} \\ -\frac{15}{7} + \frac{8}{7} \end{pmatrix}$

$X = \begin{pmatrix} \frac{10+4}{7} \\ \frac{-15+8}{7} \end{pmatrix}$

$X = \begin{pmatrix} \frac{14}{7} \\ \frac{-7}{7} \end{pmatrix}$

$X = \begin{pmatrix} 2 \\ -1 \end{pmatrix}$


Since $X = \begin{pmatrix} x \\ y \end{pmatrix}$, by equating the elements, we get:

$x = 2$

$y = -1$


The solution to the system of linear equations is $x=2$ and $y=-1$.
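
As an optional check (a minimal NumPy sketch, not part of the written solution), the same system can be solved numerically:

```python
import numpy as np

# Coefficient matrix and constants for 2x - y = 5, 3x + 2y = 4
A = np.array([[2, -1],
              [3,  2]], dtype=float)
B = np.array([5, 4], dtype=float)

X = np.linalg.inv(A) @ B   # X = A^{-1} B, as in the solution above
print(X)                   # expected: [ 2. -1.]  i.e. x = 2, y = -1
```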

Question 9. If $A = \begin{pmatrix} 0 & 2 & 1 \\ -2 & 0 & 3 \\ -1 & -3 & 0 \end{pmatrix}$, show that A is a skew-symmetric matrix. Also, verify that $(A')' = A$.

Answer:

Given:

Matrix $A = \begin{pmatrix} 0 & 2 & 1 \\ -2 & 0 & 3 \\ -1 & -3 & 0 \end{pmatrix}$.


To Show:

1. $A$ is a skew-symmetric matrix.

2. $(A')' = A$.


Solution:


Part 1: Show that A is a skew-symmetric matrix.

A square matrix $A$ is skew-symmetric if $A' = -A$.

First, find the transpose of matrix $A$, denoted by $A'$. To find the transpose, interchange the rows and columns of $A$.

$A' = \begin{pmatrix} 0 & 2 & 1 \\ -2 & 0 & 3 \\ -1 & -3 & 0 \end{pmatrix}' = \begin{pmatrix} 0 & -2 & -1 \\ 2 & 0 & -3 \\ 1 & 3 & 0 \end{pmatrix}$


Next, find the negative of the matrix $A$, $-A$.

$-A = -1 \times \begin{pmatrix} 0 & 2 & 1 \\ -2 & 0 & 3 \\ -1 & -3 & 0 \end{pmatrix} = \begin{pmatrix} -1 \times 0 & -1 \times 2 & -1 \times 1 \\ -1 \times (-2) & -1 \times 0 & -1 \times 3 \\ -1 \times (-1) & -1 \times (-3) & -1 \times 0 \end{pmatrix}$

$-A = \begin{pmatrix} 0 & -2 & -1 \\ 2 & 0 & -3 \\ 1 & 3 & 0 \end{pmatrix}$


Comparing $A'$ and $-A$, we see that:

$A' = \begin{pmatrix} 0 & -2 & -1 \\ 2 & 0 & -3 \\ 1 & 3 & 0 \end{pmatrix}$

$-A = \begin{pmatrix} 0 & -2 & -1 \\ 2 & 0 & -3 \\ 1 & 3 & 0 \end{pmatrix}$


Since $A' = -A$, the matrix $A$ is a skew-symmetric matrix. This is shown.


Part 2: Verify that $(A')' = A$.

We already found the transpose of $A$, $A' = \begin{pmatrix} 0 & -2 & -1 \\ 2 & 0 & -3 \\ 1 & 3 & 0 \end{pmatrix}$.


Now, we find the transpose of $A'$, denoted by $(A')'$. To find the transpose of $A'$, we interchange the rows and columns of $A'$.

$(A')' = \begin{pmatrix} 0 & -2 & -1 \\ 2 & 0 & -3 \\ 1 & 3 & 0 \end{pmatrix}' = \begin{pmatrix} 0 & 2 & 1 \\ -2 & 0 & 3 \\ -1 & -3 & 0 \end{pmatrix}$


Comparing $(A')'$ with the original matrix $A$, we see that:

$(A')' = \begin{pmatrix} 0 & 2 & 1 \\ -2 & 0 & 3 \\ -1 & -3 & 0 \end{pmatrix}$

$A = \begin{pmatrix} 0 & 2 & 1 \\ -2 & 0 & 3 \\ -1 & -3 & 0 \end{pmatrix}$


Since the corresponding elements are equal, we have $(A')' = A$.

Thus, the property $(A')' = A$ is verified for the given matrix $A$.
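
Both properties can be confirmed numerically; the snippet below is a minimal NumPy sketch, not part of the written solution:

```python
import numpy as np

A = np.array([[ 0,  2, 1],
              [-2,  0, 3],
              [-1, -3, 0]])

print(np.array_equal(A.T, -A))   # skew-symmetric check A' = -A: True
print(np.array_equal(A.T.T, A))  # (A')' = A: True
```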

Question 10. A trust fund has $\textsf{₹} 30000$ that must be invested in two different types of bonds. The first bond pays 5% interest per year, and the second bond pays 7% interest per year. Using matrix multiplication, determine how to divide $\textsf{₹} 30000$ among the two types of bonds if the trust fund must obtain an annual total interest of (i) $\textsf{₹} 1800$ (ii) $\textsf{₹} 2000$.

Answer:

Given:

Total amount to be invested = $\textsf{₹} 30000$.

Interest rate on the first bond = 5% per year.

Interest rate on the second bond = 7% per year.


To Determine:

How to divide $\textsf{₹} 30000$ between the two bonds to get a total annual interest of (i) $\textsf{₹} 1800$ and (ii) $\textsf{₹} 2000$, using matrix multiplication.


Solution:

Let $x$ be the amount invested in the first bond (at 5% interest) and $y$ be the amount invested in the second bond (at 7% interest).

From the problem, we have two equations based on the total amount and the total interest.

Equation 1 (Total amount): $x + y = 30000$

Equation 2 (Total interest): $5\% \text{ of } x + 7\% \text{ of } y = \text{Total Interest}$

This can be written as: $0.05x + 0.07y = \text{Total Interest}$


We can write this system of linear equations in matrix form $AZ = B$, where:

$A = \begin{pmatrix} 1 & 1 \\ 0.05 & 0.07 \end{pmatrix}$ (Coefficient matrix)

$Z = \begin{pmatrix} x \\ y \end{pmatrix}$ (Variable matrix)

$B = \begin{pmatrix} 30000 \\ \text{Total Interest} \end{pmatrix}$ (Constant matrix)

The matrix equation is $\begin{pmatrix} 1 & 1 \\ 0.05 & 0.07 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 30000 \\ \text{Total Interest} \end{pmatrix}$.


To solve for the variable matrix $Z$, we find the inverse of matrix $A$, $A^{-1}$, and then compute $Z = A^{-1}B$.

First, calculate the determinant of matrix $A$:

$\text{det}(A) = (1)(0.07) - (1)(0.05) = 0.07 - 0.05 = 0.02$

Since $\text{det}(A) = 0.02 \neq 0$, the matrix $A$ is invertible.


Next, find the adjoint of matrix $A$. For a $2 \times 2$ matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$, the adjoint is $\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$.

$\text{adj}(A) = \begin{pmatrix} 0.07 & -1 \\ -0.05 & 1 \end{pmatrix}$


Now, find the inverse of matrix $A$ using $A^{-1} = \frac{1}{\text{det}(A)}\text{adj}(A)$.

$A^{-1} = \frac{1}{0.02} \begin{pmatrix} 0.07 & -1 \\ -0.05 & 1 \end{pmatrix} = 50 \begin{pmatrix} 0.07 & -1 \\ -0.05 & 1 \end{pmatrix}$

$A^{-1} = \begin{pmatrix} 50 \times 0.07 & 50 \times (-1) \\ 50 \times (-0.05) & 50 \times 1 \end{pmatrix} = \begin{pmatrix} 3.5 & -50 \\ -2.5 & 50 \end{pmatrix}$


Now we use $Z = A^{-1}B$ for each case of total interest.


Case (i): Total Annual Interest = $\textsf{₹} 1800$

In this case, $B = \begin{pmatrix} 30000 \\ 1800 \end{pmatrix}$.

$Z = \begin{pmatrix} x \\ y \end{pmatrix} = A^{-1}B = \begin{pmatrix} 3.5 & -50 \\ -2.5 & 50 \end{pmatrix} \begin{pmatrix} 30000 \\ 1800 \end{pmatrix}$

$x = (3.5)(30000) + (-50)(1800) = 105000 - 90000 = 15000$

$y = (-2.5)(30000) + (50)(1800) = -75000 + 90000 = 15000$

So, $x = 15000$ and $y = 15000$.

For a total annual interest of $\textsf{₹} 1800$, $\textsf{₹} 15000$ should be invested in the first bond and $\textsf{₹} 15000$ should be invested in the second bond.


Case (ii): Total Annual Interest = $\textsf{₹} 2000$

In this case, $B = \begin{pmatrix} 30000 \\ 2000 \end{pmatrix}$.

$Z = \begin{pmatrix} x \\ y \end{pmatrix} = A^{-1}B = \begin{pmatrix} 3.5 & -50 \\ -2.5 & 50 \end{pmatrix} \begin{pmatrix} 30000 \\ 2000 \end{pmatrix}$

$x = (3.5)(30000) + (-50)(2000) = 105000 - 100000 = 5000$

$y = (-2.5)(30000) + (50)(2000) = -75000 + 100000 = 25000$

So, $x = 5000$ and $y = 25000$.

For a total annual interest of $\textsf{₹} 2000$, $\textsf{₹} 5000$ should be invested in the first bond and $\textsf{₹} 25000$ should be invested in the second bond.
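
As an optional check (a minimal NumPy sketch, not part of the written solution), both interest cases can be computed from the same inverse:

```python
import numpy as np

A = np.array([[1.00, 1.00],
              [0.05, 0.07]])
A_inv = np.linalg.inv(A)

for interest in (1800, 2000):
    B = np.array([30000, interest], dtype=float)
    x, y = A_inv @ B        # Z = A^{-1} B
    print(interest, x, y)   # expected: 1800 -> 15000, 15000 ; 2000 -> 5000, 25000
```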

Question 11. Using elementary row operations, find the inverse of the matrix $A = \begin{pmatrix} 3 & -1 & -2 \\ 2 & 0 & -1 \\ 3 & -5 & 0 \end{pmatrix}$.

Answer:

Given:

Matrix $A = \begin{pmatrix} 3 & -1 & -2 \\ 2 & 0 & -1 \\ 3 & -5 & 0 \end{pmatrix}$.


To Find:

The inverse of matrix $A$, denoted by $A^{-1}$, using elementary row transformations.


Solution:

We write the augmented matrix $[A | I]$, where $I$ is the identity matrix of the same order as $A$ ($3 \times 3$).

$[A | I] = \begin{pmatrix} 3 & -1 & -2 & | & 1 & 0 & 0 \\ 2 & 0 & -1 & | & 0 & 1 & 0 \\ 3 & -5 & 0 & | & 0 & 0 & 1 \end{pmatrix}$


Our goal is to transform the left side of this augmented matrix into the identity matrix by applying elementary row operations to the entire matrix. The matrix on the right side will then become the inverse of $A$.


Step 1: Make the element at (1,1) equal to 1. Apply $R_1 \to \frac{1}{3}R_1$.

$\begin{pmatrix} 1 & -\frac{1}{3} & -\frac{2}{3} & | & \frac{1}{3} & 0 & 0 \\ 2 & 0 & -1 & | & 0 & 1 & 0 \\ 3 & -5 & 0 & | & 0 & 0 & 1 \end{pmatrix}$


Step 2: Make the elements below (1,1) equal to 0. Apply $R_2 \to R_2 - 2R_1$ and $R_3 \to R_3 - 3R_1$.

$R_2 \to R_2 - 2R_1$: $(2, 0, -1) - 2(1, -1/3, -2/3) = (0, 2/3, 1/3)$. Right side: $(0, 1, 0) - 2(1/3, 0, 0) = (-2/3, 1, 0)$.

$R_3 \to R_3 - 3R_1$: $(3, -5, 0) - 3(1, -1/3, -2/3) = (0, -4, 2)$. Right side: $(0, 0, 1) - 3(1/3, 0, 0) = (-1, 0, 1)$.

$\begin{pmatrix} 1 & -\frac{1}{3} & -\frac{2}{3} & | & \frac{1}{3} & 0 & 0 \\ 0 & \frac{2}{3} & \frac{1}{3} & | & -\frac{2}{3} & 1 & 0 \\ 0 & -4 & 2 & | & -1 & 0 & 1 \end{pmatrix}$


Step 3: Make the element at (2,2) equal to 1. Apply $R_2 \to \frac{3}{2}R_2$.

$R_2 \to \frac{3}{2}R_2$: $\frac{3}{2}(0, 2/3, 1/3) = (0, 1, 1/2)$. Right side: $\frac{3}{2}(-2/3, 1, 0) = (-1, 3/2, 0)$.

$\begin{pmatrix} 1 & -\frac{1}{3} & -\frac{2}{3} & | & \frac{1}{3} & 0 & 0 \\ 0 & 1 & \frac{1}{2} & | & -1 & \frac{3}{2} & 0 \\ 0 & -4 & 2 & | & -1 & 0 & 1 \end{pmatrix}$


Step 4: Make the elements above and below (2,2) equal to 0. Apply $R_1 \to R_1 + \frac{1}{3}R_2$ and $R_3 \to R_3 + 4R_2$.

$R_1 \to R_1 + \frac{1}{3}R_2$: $(1, -1/3, -2/3) + (0, 1/3, 1/6) = (1, 0, -2/3 + 1/6) = (1, 0, -1/2)$. Right side: $(1/3, 0, 0) + (-1/3, 1/2, 0) = (0, 1/2, 0)$.

$R_3 \to R_3 + 4R_2$: $(0, -4, 2) + 4(0, 1, 1/2) = (0, 0, 4)$. Right side: $(-1, 0, 1) + 4(-1, 3/2, 0) = (-1-4, 0+6, 1+0) = (-5, 6, 1)$.

$\begin{pmatrix} 1 & 0 & -\frac{1}{2} & | & 0 & \frac{1}{2} & 0 \\ 0 & 1 & \frac{1}{2} & | & -1 & \frac{3}{2} & 0 \\ 0 & 0 & 4 & | & -5 & 6 & 1 \end{pmatrix}$


Step 5: Make the element at (3,3) equal to 1. Apply $R_3 \to \frac{1}{4}R_3$.

$R_3 \to \frac{1}{4}R_3$: $\frac{1}{4}(0, 0, 4) = (0, 0, 1)$. Right side: $\frac{1}{4}(-5, 6, 1) = (-5/4, 3/2, 1/4)$.

$\begin{pmatrix} 1 & 0 & -\frac{1}{2} & | & 0 & \frac{1}{2} & 0 \\ 0 & 1 & \frac{1}{2} & | & -1 & \frac{3}{2} & 0 \\ 0 & 0 & 1 & | & -\frac{5}{4} & \frac{3}{2} & \frac{1}{4} \end{pmatrix}$


Step 6: Make the elements above (3,3) equal to 0. Apply $R_1 \to R_1 + \frac{1}{2}R_3$ and $R_2 \to R_2 - \frac{1}{2}R_3$.

$R_1 \to R_1 + \frac{1}{2}R_3$: $(1, 0, -1/2) + (0, 0, 1/2) = (1, 0, 0)$. Right side: $(0, 1/2, 0) + (1/2)(-5/4, 3/2, 1/4) = (0 - 5/8, 1/2 + 3/4, 0 + 1/8) = (-5/8, 5/4, 1/8)$.

$R_2 \to R_2 - \frac{1}{2}R_3$: $(0, 1, 1/2) - (0, 0, 1/2) = (0, 1, 0)$. Right side: $(-1, 3/2, 0) - (1/2)(-5/4, 3/2, 1/4) = (-1 + 5/8, 3/2 - 3/4, 0 - 1/8) = (-3/8, 3/4, -1/8)$.

$\begin{pmatrix} 1 & 0 & 0 & | & -\frac{5}{8} & \frac{5}{4} & \frac{1}{8} \\ 0 & 1 & 0 & | & -\frac{3}{8} & \frac{3}{4} & -\frac{1}{8} \\ 0 & 0 & 1 & | & -\frac{5}{4} & \frac{3}{2} & \frac{1}{4} \end{pmatrix}$


The left side of the augmented matrix is now the identity matrix $I$. The matrix on the right side is the inverse of $A$.

Therefore, the inverse of matrix $A$ is:

$A^{-1} = \begin{pmatrix} -\frac{5}{8} & \frac{5}{4} & \frac{1}{8} \\ -\frac{3}{8} & \frac{3}{4} & -\frac{1}{8} \\ -\frac{5}{4} & \frac{3}{2} & \frac{1}{4} \end{pmatrix}$
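
Given the length of the row reduction, an independent check is worthwhile. The following minimal NumPy sketch (an optional verification, not part of the written solution) compares the result with a numerically computed inverse:

```python
import numpy as np

A = np.array([[3, -1, -2],
              [2,  0, -1],
              [3, -5,  0]], dtype=float)

A_inv = np.linalg.inv(A)
expected = np.array([[-5/8, 5/4,  1/8],
                     [-3/8, 3/4, -1/8],
                     [-5/4, 3/2,  1/4]])

print(np.allclose(A_inv, expected))          # expected: True
print(np.allclose(A @ expected, np.eye(3)))  # A A^{-1} = I: True
```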

Question 12. If $A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 3 & 1 \\ 3 & 1 & 2 \end{pmatrix}$ and $B = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$, find $A^2 - 5A + 4B$ and verify if it is a zero matrix.

Answer:

Given:

Matrix $A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 3 & 1 \\ 3 & 1 & 2 \end{pmatrix}$ and matrix $B = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$. Note that $B$ is the $3 \times 3$ identity matrix, $I$.


To Find and Verify:

Find the matrix $A^2 - 5A + 4B$ and verify if it is the zero matrix $O = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$.


Solution:

We need to calculate the expression $A^2 - 5A + 4B$. Since $B=I$, the expression is $A^2 - 5A + 4I$.


First, calculate $A^2 = A \times A$:

$A^2 = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 3 & 1 \\ 3 & 1 & 2 \end{pmatrix} \begin{pmatrix} 1 & 2 & 3 \\ 2 & 3 & 1 \\ 3 & 1 & 2 \end{pmatrix}$

$A^2 = \begin{pmatrix} (1)(1)+(2)(2)+(3)(3) & (1)(2)+(2)(3)+(3)(1) & (1)(3)+(2)(1)+(3)(2) \\ (2)(1)+(3)(2)+(1)(3) & (2)(2)+(3)(3)+(1)(1) & (2)(3)+(3)(1)+(1)(2) \\ (3)(1)+(1)(2)+(2)(3) & (3)(2)+(1)(3)+(2)(1) & (3)(3)+(1)(1)+(2)(2) \end{pmatrix}$

$A^2 = \begin{pmatrix} 1+4+9 & 2+6+3 & 3+2+6 \\ 2+6+3 & 4+9+1 & 6+3+2 \\ 3+2+6 & 6+3+2 & 9+1+4 \end{pmatrix}$

$A^2 = \begin{pmatrix} 14 & 11 & 11 \\ 11 & 14 & 11 \\ 11 & 11 & 14 \end{pmatrix}$


Next, we calculate $5A$:

$5A = 5 \times \begin{pmatrix} 1 & 2 & 3 \\ 2 & 3 & 1 \\ 3 & 1 & 2 \end{pmatrix}$

$5A = \begin{pmatrix} 5 \times 1 & 5 \times 2 & 5 \times 3 \\ 5 \times 2 & 5 \times 3 & 5 \times 1 \\ 5 \times 3 & 5 \times 1 & 5 \times 2 \end{pmatrix}$

$5A = \begin{pmatrix} 5 & 10 & 15 \\ 10 & 15 & 5 \\ 15 & 5 & 10 \end{pmatrix}$


Next, we calculate $4B$ (or $4I$):

$4B = 4I = 4 \times \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$

$4B = \begin{pmatrix} 4 \times 1 & 4 \times 0 & 4 \times 0 \\ 4 \times 0 & 4 \times 1 & 4 \times 0 \\ 4 \times 0 & 4 \times 0 & 4 \times 1 \end{pmatrix}$

$4B = \begin{pmatrix} 4 & 0 & 0 \\ 0 & 4 & 0 \\ 0 & 0 & 4 \end{pmatrix}$


Now, we calculate $A^2 - 5A + 4B$:

$A^2 - 5A + 4B = \begin{pmatrix} 14 & 11 & 11 \\ 11 & 14 & 11 \\ 11 & 11 & 14 \end{pmatrix} - \begin{pmatrix} 5 & 10 & 15 \\ 10 & 15 & 5 \\ 15 & 5 & 10 \end{pmatrix} + \begin{pmatrix} 4 & 0 & 0 \\ 0 & 4 & 0 \\ 0 & 0 & 4 \end{pmatrix}$


Perform the subtraction $A^2 - 5A$:

$A^2 - 5A = \begin{pmatrix} 14-5 & 11-10 & 11-15 \\ 11-10 & 14-15 & 11-5 \\ 11-15 & 11-5 & 14-10 \end{pmatrix}$

$A^2 - 5A = \begin{pmatrix} 9 & 1 & -4 \\ 1 & -1 & 6 \\ -4 & 6 & 4 \end{pmatrix}$


Now, perform the addition $(A^2 - 5A) + 4B$:

$(A^2 - 5A) + 4B = \begin{pmatrix} 9 & 1 & -4 \\ 1 & -1 & 6 \\ -4 & 6 & 4 \end{pmatrix} + \begin{pmatrix} 4 & 0 & 0 \\ 0 & 4 & 0 \\ 0 & 0 & 4 \end{pmatrix}$

$(A^2 - 5A) + 4B = \begin{pmatrix} 9+4 & 1+0 & -4+0 \\ 1+0 & -1+4 & 6+0 \\ -4+0 & 6+0 & 4+4 \end{pmatrix}$

$(A^2 - 5A) + 4B = \begin{pmatrix} 13 & 1 & -4 \\ 1 & 3 & 6 \\ -4 & 6 & 8 \end{pmatrix}$


The resulting matrix is $\begin{pmatrix} 13 & 1 & -4 \\ 1 & 3 & 6 \\ -4 & 6 & 8 \end{pmatrix}$.


Verification:

We need to verify if the result is a zero matrix $O = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$.

Comparing the calculated matrix $\begin{pmatrix} 13 & 1 & -4 \\ 1 & 3 & 6 \\ -4 & 6 & 8 \end{pmatrix}$ with the zero matrix $\begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$, we see that the corresponding elements are not all zero.


Therefore, $A^2 - 5A + 4B$ is not a zero matrix for the given matrices $A$ and $B$.
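
As an optional check (a minimal NumPy sketch, not part of the written solution), the whole computation can be reproduced in a few lines:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [2, 3, 1],
              [3, 1, 2]])
B = np.eye(3, dtype=int)   # B is the 3x3 identity matrix

result = A @ A - 5*A + 4*B
print(result)               # expected: [[13, 1, -4], [1, 3, 6], [-4, 6, 8]]
print(np.any(result != 0))  # not the zero matrix: True
```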